Safe C++ proposal is not being continued (sibellavia.lol)
160 points by charles_irl 16 hours ago | 143 comments
tialaramex 14 hours ago [-]
I am actually much more pessimistic about Profiles than Simone.

Regardless of the technology, the big thing Rust has that C++ does not is safety culture, and that's dominant here. You could also see from the 2024 "Fireside chat" at CppCon that this isn't likely to change any time soon.

The profiles technology isn't very good. But that's insignificant next to the culture problem: once you've decided to make the fifteen-minute bagpipe dirge your lead single, it doesn't really matter whether you use the colored vinyl.

AlotOfReading 13 hours ago [-]
It doesn't show up in the online videos, but there was a huge contingent of people at that fireside chat wanting a reasonable safety story for C++. The committee simply doesn't have representation from those people and doesn't seem to understand why it's an existential risk to the language community. The delivery timelines are so long here that anything not standardized soon isn't going to arrive for a decade or more. That's all the time in the world for Rust (or even Zig) to break down the remaining barriers.

Profiles and sanitizers just aren't sufficient.

IshKebab 13 hours ago [-]
Yeah, because the committee is now made up of people who a) really love C++, and b) don't care enough about safety to use Rust instead.

I think there are plenty of people who must use C++ due to legacy, management or library reasons and who do care about safety. But those people aren't going to join language committees.

john_the_writer 53 minutes ago [-]
I don't know that they don't care about safety. They just don't agree with the definition others have picked. I remember when managed code became a thing. I, being an old C++ dev, noted that memory was always managed. It was managed by me.
WalterBright 9 hours ago [-]
D adds a lot to memory safety without needing to struggle with program redesigns that Rust requires.

These include:

1. bounds checked arrays (you can still use raw pointers instead if you like)

2. default initialization

3. static checks for escaping pointers

4. optional use of pure functions

5. transitive const and immutable qualifiers

6. ranges based on slices rather than pointer pairs

IshKebab 4 hours ago [-]
I think D failed to gain widespread traction for other reasons though:

1. The use of garbage collection. If you accept GC, there are many other languages you can use. If you don't want GC, the only realistic option was C++. Rust doesn't rely on GC.

IIRC GC in D is optional in some way, but the story always felt murky to me and that always felt like a way of weaseling out of that problem - like if I actually started writing D I'd find all the libraries needed GC anyway.

2. The awkward standard library schism.

3. Small community compared to C++. I think it probably just didn't offer enough to overcome this, whereas Rust did. Rust also had the help of backing from a large organisation.

I don't recall anyone ever mentioning its improved safety. I had a look on Algolia back through HN and most praise is about metaprogramming or it being generally more modern and sane than C++. I couldn't find a single mention of anything to do with safety or anything on your list.

Whereas Rust shouts safety from the rooftops. Arguably too much!

WalterBright 3 hours ago [-]
Using D does not require a garbage collector. You can use it, or not, and you can use the GC for some allocations, and use other methods for other allocations.

D has a lot of very useful features. Memory safety features are just one aspect of it.

> The awkward standard library schism.

???

Don't underestimate the backing of a large and powerful organization.

IshKebab 25 minutes ago [-]
> You can use it, or not, and you can use the GC for some allocations, and use other methods for other allocations.

Yes but people wanted a language where you can't use GC.

> ???

"Which standard library should I use?" is not a question most languages have:

https://stackoverflow.com/q/693672/265521

Surely... you were aware of this problem? Maybe I misunderstood the "???".

> Don't underestimate the backing of a large and powerful organization.

Yeah it definitely matters a lot. I don't think Go would have been remotely as successful as it has been without Google.

But also we shouldn't overstate it. It definitely helped Rust to have Mozilla, but Mozilla isn't nearly as large and powerful as Google. The fact that it is an excellent language with generally fantastic ergonomics and first-of-its-kind practical memory safety without GC... probably more important. (Of course you could argue it wouldn't have got to that point without Mozilla.)

pjmlp 4 hours ago [-]
However D still needs the ecosystem and support from platform vendors.

Unfortunately that was already lost: Java/Kotlin, Go, C# and Swift are the platform holders' darlings among safe languages with GC, being improved for low-level programming with each release, many with features that you may argue were in D first, and Rust covers everything else.

Microsoft recently announced first-class support for writing drivers in Rust, and I am fairly certain NVidia will be supportive of future Rust support on CUDA, after they get their new Python cu tiles support going across the ecosystem.

Two examples out of many others.

The language is a great systems programming language; what is missing is the rest of the owl.

Voultapher 5 hours ago [-]
> With program redesigns that Rust requires

Why does Rust sometimes require program redesigns? Because these programs are flawed at some fundamental level. D lacks the most important and hardest kind of safety, and that is reference safety - curiously, C++ profiles also lack any solution to that problem. A significant amount of production C++ code is riddled with UB and will never be made safe by repainting it and bounds checking.

Claiming that not being forced to fix something fundamentally broken is an advantage when talking about safety doesn't make you look like a particularly serious advocate for the topic.

WalterBright 5 hours ago [-]
> Why does Rust sometimes require program redesigns? Because these programs are flawed at some fundamental level.

I'm familiar with borrow checkers, as I wrote one for D.

Not following the rules of the borrow checker does not mean the program is flawed or incorrect. It just means the borrow checker is unable to prove it correct.

> D lacks the most important and hardest kind of safety and that is reference safety

I look at compilations of programming safety errors in shipped code now and then. Far and away the #1 bug is out-of-bounds array access. D has solved that problem.

BTW, if you use the optional GC in D, the program will be memory safe. No borrow checker needed.

Voultapher 5 hours ago [-]
> I look at compilations of programming safety errors in shipped code now and then. Far and away the #1 bug is out-of-bounds array access. D has solved that problem.

Do you have good data on that? Looking at the curl and Chromium reports they show that use-after-free is their most recurring and problematic issue.

I'm sure you are aware, but I want to mention this here for other readers. Reference safety extends to things like iterators and slices in C++.

> Not following the rules of the borrow checker does not mean the program is flawed or incorrect.

At a scale of 100k+ LoC every single measured program has been shown to be flawed because of it.

WalterBright 3 hours ago [-]
No, I haven't kept track of the reports I've seen. They all had array bounds as the #1 error encountered in shipped code.

Edit: I just googled "causes of memory safety bugs in C++". Number 1 answer: "Buffer Overflows/Out-of-Bounds Access"

"Undefined behavior in C/C++ code leads to security flaws like buffer overflows" https://www.trust-in-soft.com/resources/blogs/memory-safety-...

"Some common types of memory safety bugs include: Buffer overflows" https://www.code-intelligence.com/blog/memory_safety_corrupt...

"Memory Safety Vulnerabilities 3.1. Buffer overflow vulnerabilities We’ll start our discussion of vulnerabilities with one of the most common types of errors — buffer overflow (also called buffer overrun) vulnerabilities. Buffer overflow vulnerabilities are a particular risk in C, and since C is an especially widely used systems programming language, you might not be surprised to hear that buffer overflows are one of the most pervasive kind of implementation flaws around." https://textbook.cs161.org/memory-safety/vulnerabilities.htm...

Voultapher 3 hours ago [-]
Spatial safety can be achieved exhaustively with a single compiler switch (in clang) and a minor performance hit. Temporal safety is much harder and requires software redesign; that's why it still remains a problem even in projects that care about memory safety and have tried for a long time to weed out all instances of UB, i.e. critical software like curl, Linux and Chromium.

Temporal safety is usually also much harder to reason about for humans, since it requires more context.

IshKebab 21 minutes ago [-]
What flag is that? Address sanitizer has a 2x performance hit so presumably not that?
wolvesechoes 2 hours ago [-]
> Why does Rust sometimes require program redesigns? Because these programs are flawed at some fundamental level.

Simply not true, and this stance is one of the reasons we have people talking about the Rust sect.

drysine 12 minutes ago [-]
> Because these programs are flawed at some fundamental level.

That's a very strong statement. How do you support it with arguments?

akoboldfrying 2 hours ago [-]
> Because these programs are flawed at some fundamental level.

No. Programs that pass borrow checking are a strict subset of programs that are correct with respect to memory allocation; an infinite number of correct programs do not pass it. The borrow checker is a good idea, but it's (necessarily) incomplete.

Your claim is like saying that a program that uses any kind of dynamic memory allocation at all is fundamentally broken.

shawn_w 7 hours ago [-]
They could also care about safety but just not like the Rust approach.
AlotOfReading 6 hours ago [-]
This was asked at the aforementioned chat. Andreas Weis (MISRA) responded along the lines of "You shouldn't be writing new code in C++ if you want guarantees". Might not have the identity correct, my notes aren't in front of me.
john_the_writer 52 minutes ago [-]
Love this quote. And love the intent.
Voultapher 5 hours ago [-]
I came to the same conclusion in a talk I gave to the Munich C++ Meetup [1]. There is a prevalent culture of expecting users to not make mistakes. Library constructs that could be made significantly safer to use are instead kept easy to use incorrectly, usually with the argument of performance. The irony is that, if you look closer, the performance optimization that was done amounts to removing the seatbelts from a small hatchback to save on weight.

[1] https://youtu.be/rZ7QQWKP8Rk or text form https://github.com/Voultapher/Presentations/blob/main/safety...

vintagedave 4 hours ago [-]
A year or so ago I read that there was a design decision railroaded through the committee about what kind of safety approach could be looked at. Its wording effectively prevented Safe C++. I was not at this meeting so I’m going on what others say:

https://www.reddit.com/r/cpp/comments/1hppdzc/comment/m4jjo4...

I'm a big fan of Safe C++ and believe its approach (learning from another language, incremental opt-in: just like all good refactorings, work on code and improve it piece by piece) would have been the path that solved some genuine problems. Profiles seem a hodgepodge. And, to share a personal worry about what I read into comments like the one above (this is an impression, not a statement of fact): I worry deeply about the relationship between who proposes what, who has the pricklier personality or fewer connections, and which approach gets accepted.

I wish Safe C++ would continue as a hard fork of the language.

Voultapher 3 hours ago [-]
As much as the alternatives (profiles) don't solve the issue, Safe C++ (Circle) has substantial issues as well. You need a separate and largely incompatible standard library, including containers. Generic code (templates) is so far largely unsolved at a conceptual level. At this point, incrementally replacing parts of your code with Rust - which has a mature ecosystem and tooling; remember, if you want provable safety, none of the dependencies used by your Safe C++ code are allowed to be unsafe - is going to be less hassle. Firefox showed you can do it, and even Microsoft is choosing that path for the Windows kernel. Waiting for Safe C++ to become usable seems like waiting for a worse Rust. Interop is hardly going to be much better than the bindings generated by cxx.
astrobe_ 3 hours ago [-]
> There is a prevalent culture of expecting users to not make mistakes.

I think the older of us C/C++ programmers come from no-safety languages like assembly language. That doesn't mean that all of us are "macho programmers" (as I was called here once). C's weak typing and compilers emitting warnings give a false sense of security which is tricky to deal with.

The statement you make is not entirely correct. The more correct statement is that there is a prevalent culture of expecting users to find strategies to avoid mistakes. We are engineers. We do what we need with what we have, and we did what we had to with what we had.

When you program with totally unsafe languages, you develop more strategies than just relying on a type checker and borrow checker: RAII, "crash early", TDD, completion-compatible naming conventions, even syntax highlighting (coloring octal numbers differently)...

BUT. the cultural characteristics of the programmers are only one-quarter of the story. The bigger part is about company culture, and more specifically the availability of programmers. You won't promote safer languages and safer practice by convincing programmers that it has zero impact on performance. It's the companies that you need to convince that the safe alternatives are as productive [1] as the less safe alternatives.

[1] https://xkcd.com/303/

Voultapher 3 hours ago [-]
I get the feeling you didn't watch my talk. The example in question is sorting. Say, for example, your comparison function does not implement a strict weak ordering, which can easily happen if you use <= instead of <. In C++ you routinely get out-of-bounds reads and writes; in Rust you get some unspecified element order.

In what world is the first preferable to the latter?

This behavior is purely an implementation choice. Even the C people at glibc and LLVM libc consider this undesirable and are willing to spend 2-3% overhead on making sure you don't get that behavior.

No, this is not "expecting users to find strategies to avoid mistakes".
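
To make that concrete, here is a minimal C++ sketch (my illustration, not taken from the talk) of the <= mistake described above:

    #include <algorithm>
    #include <vector>

    int main() {
        std::vector<int> v(64, 7);  // lots of equal elements makes the problem easy to hit

        // BUG: "<=" is not a strict weak ordering (two equal elements each compare "before"
        // the other). std::sort requires a strict weak ordering; violating that precondition
        // is undefined behavior, and typical implementations can read or write out of bounds.
        std::sort(v.begin(), v.end(), [](int a, int b) { return a <= b; });

        // Correct: "<" is a strict weak ordering.
        std::sort(v.begin(), v.end(), [](int a, int b) { return a < b; });
    }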

aw1621107 38 minutes ago [-]
> Even the C people glibc and LLVM libc consider this to be undesirable and are willing to spend 2-3% overhead on making sure you don't get that behavior.

libc++ actually had to roll back a std::sort improvement because it broke too much code that was relying on bad comparators. From the RFC for adding comparator checks to debug libc++ [0]:

> Not so long ago we proposed and changed std::sort algorithm [1]. However, it was rolled back in 16.0.1 because of failures within the broken comparators. That was even true for the previous implementation, however, the new one exposed the problems more often.

[0]: https://discourse.llvm.org/t/rfc-strict-weak-ordering-checks...

[1]: https://reviews.llvm.org/D122780 (not the original link, but I think this is the review for the changeset that was rolled back)

jcranmer 11 hours ago [-]
C and C++ are fundamentally memory-unsafe languages. That doesn't make them bad languages, but it is a reality that you have to face when you work with them. And one of the things we've learned is that building safe abstractions, while not a complete solution, goes quite a long way.

And then CISA suggested that "maybe we should stop using memory-unsafe languages." And this has some of the C++ committee utterly terrified; they need something that lets them tell the government that C++, today, is memory-safe. That thing is C++ profiles. It's not about actually making C++ memory-safe, it's about being able to check the box that C++ is memory-safe, and this is so important it needs to be voted into the standards yesterday and why are you guys saying mean things about C++ profiles...

C++ profiles is a magic solution to the problem. As one committee member noted, there's not enough description of profiles yet to even figure out if it can be implemented or not. Instead, it's just a vague, well, compile with -fsanitize=address, compile with fortify-source, use hardened malloc, that makes my code memory-safe, right? And for as long as profiles remains a magic solution to check a box, it will remain vaporware in practice.

One of the real risks I see in the C++ committee is that they seem to want to drive all of the implementers out of the room.

pjmlp 4 hours ago [-]
Spot on. Since C++11 the committee has increasingly designed and added features to the standard without any kind of implementation; only after the standard gets ratified do the implementers find out that the design is broken or has flaws.

A strange phenomenon akin to the Algol 68 days, and a path no other ISO-based language is taking; elsewhere, standardising existing practice or having a full test implementation is still pretty much what is being done.

How many export templates, GC, type traits defect fixes, volatile behaviour changes, modules, contracts,.... can implementers still put up with?

murderfs 3 hours ago [-]
The committee got burned, extremely badly, by C++03's export templates, which got standardized without an implementation; everyone realized it was basically unimplementable, and the only group that did implement it wrote a paper telling everyone else not to. [1]

This was well in mind when C++11 came around (especially since C++11 slipped so badly: it was originally "C++0x", and then they ran out of digits for x). By the time we hit C++20, I think the lessons were lost, especially when it came to there being two competing modules implementations and the committee deciding to select neither.

1: https://www.open-std.org/jtc1/sc22/wg21/docs/papers/2003/n14...

versteegen 1 hours ago [-]
Wow, that paper is absolutely damning.

> Design: 1.5 years (elapsed) to come up with a design they believed they could implement.

> Development: 3 person-years (3 people × >1 year each)

> (Note: By comparison, implementing the complete Java language from scratch took the same team 2 person-years.

pjmlp 5 hours ago [-]
As discussed multiple times, I agree with the sentiment.

I think we are reaching a phase where C++ won't be going away, as it is quite relevant in many fields; however, the two-languages approach will keep growing, and many will consider C++26 good enough for such scenarios.

C++26 and not lower, due to reflection.

I am certain anything else beyond C++26 will only be considered by hardcore C++ shops that culturally won't ever use anything else, besides scripting for builds and OS automation tasks.

Animats 13 hours ago [-]
> Regardless of the technology the big thing Rust has that C++ does not is safety culture, and that's dominant here.

True. So many proposals have gone by over the years. Here's one of mine from 2001.[1] Bad idea. The layers of cruft in C++ have become so deep that it's a career just to understand them.

DARPA has something called the TRACTOR program, "Translate All C to Rust". It's been underway for a year, and they have a consortium of universities working on it. Not much, if anything, has come out. Disappointing.

Rust is probably too hard. I write 100% safe Rust, and there are times when I hit an ownership structure wall and have to spend several days re-planning. So far I've always succeeded without using "unsafe" or indices, but it drags down productivity.

Although object-oriented programming is out of fashion, classes with inheritance are useful. It's really hard to do something comparable in Rust. Traits are not that helpful for this.

Go is a good compromise. Safety at a minor cost in performance. Go is good enough for web back end stuff. Go has both GC and "green threads". This automates the problems that wear people down in C++ and Rust.

[1] https://www.animats.com/papers/languages/cppstrictpointers.h...

zozbot234 10 hours ago [-]
> So far I've always succeeded without using "unsafe" or indices, but it drags down productivity.

There is a common perception that Rust is less productive than competing languages, but empirical research by Google and others has found this to be wrong. Rust just shifts the effort earlier in the development phase, where the costs are often orders of magnitude lower. You may spend a few hours struggling with the borrow checker, but that saves you countless days of debugging highly non-trivial defects, especially in a larger codebase.

> Although object-oriented programming is out of fashion, classes with inheritance are useful. It's really hard to do something comparable in Rust. Traits are not that helpful for this.

FWIW, "classes with inheritance" in Rust can be very elegantly modeled with generic typestate. (Traits are used as part of this pattern, but are not the full story.) It might look clunky at first glance, but it accurately reflects the underlying semantics.

lelanthran 5 hours ago [-]
> Rust just shifts the effort earlier in the development phase, where the costs are often orders of magnitude lower.

That works fantastically when you're rewriting something - you already have the idea and final product nailed down.

It works poorly when you don't have everything nailed down and might switch a lot of stuff around, or remove stuff that isn't needed, etc.

goku12 2 hours ago [-]
> It works poorly when you don't have everything nailed down and might switch a lot of stuff around, or remove stuff that isn't needed, etc.

I do prototype applications in Rust and it involves heavy refactoring, including deletions. Those steps are the easiest ones for me and rarely give me any headache. Part of the reason is the interfaces that you're forced to define clearly early on. Even the unrelated friction of satisfying the borrow checker gently nudges you towards that.

The real problems are often caused by certain operations that the type system can't prove to be safe, even when they are. For example, you couldn't write async closures until recently. Such situations often require lots of thought to resolve. You may have to restructure your code or use a workaround like RC variables.

The point is, these sorts of assumptions often don't seem to hold in practice, at least in my experience. My personal experience doesn't agree with the assertion that prototyping is hard in Rust.

zozbot234 4 hours ago [-]
> It works poorly when you don't have everything nailed down and might switch a lot of stuff around

If you're prototyping code you can just do defensive .clone() calls and use Rc<> to avoid borrow checker issues. You don't need maximum efficiency, and the added boilerplate doesn't hurt that much: in fact, it helps should you want to refactor the code later.

pjmlp 4 hours ago [-]
I dislike Go's minimalism, however it fits something I have been saying for years.

Many languages that predated Java and C#, already had everything that Go offers and then some.

Modula-3, Oberon, Oberon-2, Active Oberon, Component Pascal, Eiffel.

Had Java and C#, just like those languages, had full support for AOT compilation, value types and the same low-level programming capabilities, much of the stuff that was still written in C or C++ during 2000-2010 would not have been, and maybe C++11 would not have been as relevant as it was.

During that decade many people kept writing C or C++, because they lacked mainstream alternatives for AOT compiled languages, and not because they were into low level systems programming.

vlovich123 12 hours ago [-]
> So far I've always succeeded without using "unsafe" or indices, but it drags down productivity.

I really don't understand this perspective. The whole philosophy of Rust is one where you document why "unsafe" is safe. It is not, and never has been, a goal to make everything safe, because that is impossible to reconcile with a high-performance systems language: hardware itself is unsafe. That's why the unsafe keyword exists. If total safety were the goal, unsafe wouldn't exist.

usefulcat 10 hours ago [-]
If unsafe is not used, then no one has to determine whether the unsafe parts are actually safe.
vlovich123 8 hours ago [-]
Sure, but taken to an extreme you see the absurd degree to which you have to contort yourself. And that's against the current version of the proof checker - some unsafes are only temporary, until a better prover comes along.

You shouldn’t go out of your way to use unsafe, but between that and 2 weeks refactoring, I’ll take the unsafe and use tools like miri or ASAN to provide extra guards. Engineering is inherently about making practical choices.

Animats 6 hours ago [-]
I just start everything with

    #![forbid(unsafe_code)]
cass0wary 10 hours ago [-]
TRACTOR is currently proceeding. The program is structured in phases. Each phase will present the participants with increasingly difficult challenges to translate. At the end of each phase the participants will be tested and the results of these tests will be publicly announced. The first phase of TRACTOR began in June and will run for six months.
cleartext412 11 hours ago [-]
Judging by this https://vt.social/@lina/113056457969145576, rewriting any even remotely complex project to Rust will require making decisions (function signatures, ownership and so on) based on information that might not be present in the C code at all, like API conventions. A translator being able to decide all these things automatically would probably be quite close to solving the halting problem.
Animats 7 hours ago [-]
Sigh. As I point out occasionally, the halting problem very rarely comes up in practice. Combinatorial explosion, yes, but not actual undecidability.

Understanding the implicit constraints of C/C++ functions is something where an LLM can help. Once you have the constraints recorded, they become formal constraints at the call. Historic C isn't expressive enough for even basic constraints.

Most of the constraints mentioned are at least expressible in Rust.

jandrewrogers 7 hours ago [-]
I think something less obvious to people is that type inheritance in C++ has several uses outside of building naive object hierarchies. Even if your model is based on composition as is typically the case these days, inheritance is a useful tool for expressing some metaprogramming mechanics and occasionally literal old style inheritance is actually the right thing to do. You don’t need it most of the time but sometimes not having it makes everything much uglier.

As of C++20 in particular, C++ has taken on a very traits-y character if you go all-in on the new language features.

The way out for C++ is probably to lean into compile-time codegen and verification within the language, which is already a pretty unique capability. It dramatically reduces the lines of code a developer has to write. Defect rates closely track lines of code written regardless of the language, so large improvements in compile-time expressiveness are a pretty big win.
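
A tiny sketch (mine, purely illustrative) of the kind of in-language compile-time verification being referred to: a check that runs entirely at compile time and fails the build rather than the program:

    #include <cstddef>
    #include <string_view>

    // consteval forces evaluation at compile time; here it counts "{}" placeholders
    // in a format string, as a stand-in for any domain-specific validation.
    consteval int count_placeholders(std::string_view fmt) {
        int n = 0;
        for (std::size_t i = 0; i + 1 < fmt.size(); ++i)
            if (fmt[i] == '{' && fmt[i + 1] == '}') ++n;
        return n;
    }

    // Verified before the program ever runs; a mismatch is a compile error, not a crash.
    static_assert(count_placeholders("x={} y={}") == 2);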

pjmlp 4 hours ago [-]
Sadly we got concepts lite instead of C++0x concepts, so while better than SFINAE or tag dispatch, it is still a half solution, and it won't get better, because those behind it eventually went on to Swift and nowadays Hylo.
debo_ 12 hours ago [-]
I think there is room for an ML with a modern toolchain story that just omits Rust's borrow checker and does something more boring. Typescript and Rust have primed a large number of developers to be open to it.
throwawaymaths 11 hours ago [-]
this is kinda the opposite of what i think people really want. they want an LLL with borrow checking without all of the abstractions and baggage you get in rust.
int_19h 7 hours ago [-]
Most of those abstractions and baggage come from the need to be able to represent and propagate lifetime constraints, though.
throwawaymaths 5 hours ago [-]
proc macros? Optional?
zozbot234 10 hours ago [-]
Isn't ReasonML pretty much that language already? Although the most popular language in that broader niche is probably Golang.
efuquen 14 hours ago [-]
And I would say the deficiencies in Profiles and the fact that Safe C++ was killed are technical decisions reflecting the culture problem.
pizlonator 13 hours ago [-]
> The profiles technology isn't very good.

Can you be very specific about why?

Here's the argument for why profiles might work: with all of the profiles enabled, you are only allowed to use the safe subset of C++ and all of the unsafe stuff is hidden behind APIs whose implementations don't have those profiles enabled. Those projects that enable all profiles by default effectively get Swift-like or Rust-like protection.

Like, you could force all array operations to use C++ stdlib primitives, enable full hardening of the stdlib, and then have bounds safety.
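
As a rough illustration of the bounds-safety half of that (my sketch, not from any profiles paper), here is the difference between unchecked and checked stdlib access; hardened library builds typically add a similar check to the unchecked form as well (usually trapping rather than throwing):

    #include <cstdio>
    #include <stdexcept>
    #include <vector>

    int main() {
        std::vector<int> v{1, 2, 3};

        // Plain operator[] does no bounds check: v[3] here would be undefined behavior.

        // at() is the always-checked primitive: it throws instead of reading out of bounds.
        try {
            std::printf("%d\n", v.at(3));
        } catch (const std::out_of_range&) {
            std::puts("out-of-range access caught");
        }
    }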

And you could force all lifetime operations to use C++ stdlib refcounting primitives, and then have lifetime safety in a Swift-like way (i.e. eager refcounting everywhere).

I can imagine how this falls over but then it might just be a matter of engineering to make it not fall over.

(I'm playing devils advocate a bit since I prefer Fil-C++.)

tialaramex 11 hours ago [-]
Yes I can be specific.

Firstly, you need composition. Rust's safety composes. The safe Rust library for farm animals from Geoff, the safe Rust library for cooking recipes by Alice and the safe Rust library for web server by Bert together with my safe program code adds up to my safe Rust farm foods web site.

By having N profiles, where N is intended to be at least five and might grow arbitrarily and be user extensible, C++ guarantees it cannot deliver composition this way.

Maybe they can define some sort of composition and maybe everybody will ship software which conforms to that definition and so eventually they get composition, that's not there today, so it's just a giant unknown at best.

Secondly, of the profiles described so far, most of them are just solving parts of the single overarching problem Rust addresses, for the serial case. So if they ship that, which already involves some amount of new work yet to be finished, you need all of those profiles to get to only partial memory safety.

Which comes to the third part. Once you start down this path, as they found, you realise you actually want a borrowck. You won't call it that of course, because that would be embarrassing. But you'll need to track reference lifetimes and you'll need annotation and you end up building most of the stuff you insisted you didn't want. For now, you can handwave, this is an unsolved static analysis problem. Well, not so much unsolved as you know the solution and you don't like it.

Your idea to do the reference counting everywhere is not something WG21 has looked at; I think the perf cost is sufficiently bad that they won't even glance at it. They're also not going to ship a GC.

Finally though, C++ is a concurrent language. It has a whole memory model which doesn't even make sense if you aren't thinking about concurrency. But to deliver concurrent memory safety without Fil-C's overheads you would want... well, Rust's Send and Sync traits, which sure enough have eerie twins in the Safe C++ proposal. No attempt to solve this is even hinted at in the current profiles proposal, and they would need to work one out and if it's not Send + Sync again they'd need to prove it is correct.

silon42 4 hours ago [-]
+1 ... Rust has done pretty much the minimal thing that one needs to write C/C++ like programs safely... things must fit together to cover all scenarios (borrow checker / mut / send / sync / bounds checking). Especially for multithreading.

C++ / profiles will not be able to do much less or much different to achieve the same goals.

pizlonator 10 hours ago [-]
I think the point is that folks will incrementally move their code towards having all profiles enabled, and that's sort of fundamental if the goal is to give folks with C++ codebases an incremental path to safety. So I don't buy your first and second points.

> Which comes to the third part. Once you start down this path, as they found, you realise you actually want a borrowck.

That's a bold statement. It might be true for some very loose definition of "borrow checker". See the super simple static analysis that WebKit uses (that presentation is now linked in at least two places on this HN discussion, so I won't link it again).

> Your idea to do the reference counting everywhere is not something WG21 has looked at, I think the perf cost is sufficiently bad that they won't even glance at it. They're also not going to ship a GC.

The point isn't to have ref counting on every pointer at the language level, but rather: if you prevent folks from calling `delete` directly (as one of the profiles does) then you're effectively forcing folks to use smart pointers.

Reference counting that happens by smart pointers is something that they would ship. We know this because it's already happened.

I imagine this would really mean that some references are ref counted (if you use shared_ptr or similar) while other references use some other policy.
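
A sketch (mine, illustrative only) of what that forcing function looks like in practice:

    #include <memory>

    struct Node { int v = 0; };

    int main() {
        // Banned under such a profile: raw new/delete, where a missed or doubled
        // delete is undefined behavior.
        // Node* n = new Node; /* ... */ delete n;

        // Forced alternative: ownership via smart pointers; deletion is automatic.
        auto unique = std::make_unique<Node>();   // sole owner
        auto shared = std::make_shared<Node>();   // reference-counted owner
    }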

> Finally though, C++ is a concurrent language. It has a whole memory model which doesn't even make sense if you aren't thinking about concurrency. But to deliver concurrent memory safety without Fil-C's overheads you would want... well, Rust's Send and Sync traits

Yeah, this might be an area where they leave a hole. Like, you might have reference counting that is only partially thread safe:

- The refcount of any object is atomic.

- The smart pointer itself is racy. So, racing on pointers can pop the protections.

If they got that far, then that wouldn't be so bad. The marginal safety advantage of Rust would be very slim at that point.

pjmlp 4 hours ago [-]
> I think the point is that folks will incrementally move their code towards having all profiles enabled, and that's sort of fundamental if the goal is to give folks with C++ codebases an incremental path to safety.

I doubt it, because the reason I favoured C++ over C back in 1993 was the safety culture, as someone coming from Turbo Pascal.

Somehow this has been deteriorating since 2000, as C++ kept getting C refugees who would rather have kept using C, but whose work now required C++.

Most of the hardening capabilities that are being added now were already part of the compiler frameworks of the 1990s, e.g. Turbo Vision, OWL, MFC, CSet++, MacApp, PowerPlant, ...

Rusky 8 hours ago [-]
If that is what profiles were actually doing, it would probably make sense. But it's not what profiles are doing.

Instead, for example, the lifetime safety profile (https://github.com/isocpp/CppCoreGuidelines/blob/master/docs...) is a Rust-like compile time borrow checker that relies on annotations like [[clang::lifetimebound]], yet they also repeatedly insist that profiles will not require this kind of annotation (see the papers linked from https://www.circle-lang.org/draft-profiles.html#abstract).

Their messaging is just not consistent with the concrete proposals they have described, let alone actually implemented.

pjmlp 4 hours ago [-]
Additionally, they ignore field experience. I can tell you that on VC++ the lifetime checker has only worked on small examples; I was really keen on trying it out.

Microsoft even has blog posts admitting that it can only be improved with SAL-like annotations, while keeping the usual C++ semantics.

Yet WG21 has ignored this field experience.

dminik 3 hours ago [-]
As far as I'm concerned, there are two main issues with profiles:

1. They're either unimplementable or useless (too many false positives and false negatives).

I think this is pretty evident based on the fact that profiles have been proposed for a while and that no real implementation exists. Worse, out of all of the open source projects and for-profit companies, no one has been able to implement any sort of static analysis that would even begin to approach the guarantees Rust makes.

2. The language doesn't give you any tools to actually write safe code.

Ok, let's say that someone actually implements safety profiles. And it highlights your usage of a standard library type. What do you do?

Safe C++ didn't require a new standard library just because. The current stdlib is riddled with safety issues that can't really be fixed and would not be fixed because of backwards compatibility.

You're stuck. And so you turn the safety profile off.

steveklabnik 11 hours ago [-]
> with all of the profiles enabled, you are only allowed to use the safe subset of C++ and all of the unsafe stuff is hidden behind APIs whose implementations don't have those profiles enabled.

This is not the goal of profiles. It’s to be “good enough.” Guaranteed safety isn’t in the cards.

pizlonator 11 hours ago [-]
> This is not the goal of profiles. It’s to be “good enough.” Guaranteed safety isn’t in the cards.

- Rust isn’t totally guaranteed safe since folks can and do use unsafe code.

- Exact same situation in Swift

- Go has escape hatches, for example if you race, but not only that.

So most “safe” things are really “safe enough” for some definition of “enough”.

steveklabnik 11 hours ago [-]
You’re misunderstanding what I’m saying. Safe Rust guarantees memory safety. Profiles do not. This is regardless of the ability of the unchecked versions, on both sides, to introduce issues.

Profiles do not, even for code that is 100% using profiles, guarantee safety.

pizlonator 10 hours ago [-]
The kind of "safe Rust" where you never use `unsafe` and never call into a C library is theoretical. None of the major ports of software to Rust achieve that.

So, no matter what safe language we talk about, "safety" always has its caveats.

Can you be specific about what missing safety feature of profiles leads you to be so negative about them?

steveklabnik 10 hours ago [-]
No, I am saying that safe Rust says "if the unsafe code is correct, safe Rust means memory safety." Profiles don't even reach that bar; they say "code under profiles is safer."

It’s not about specifics, it’s about the stated goals of profiles. They do not claim to prove memory safety even with all of them turned on.

dwattttt 7 hours ago [-]
> The kind of "safe Rust" where you never use `unsafe` and never call into a C library is theoretical. None of the major ports of software to Rust achieve that.

An entire program ported to Rust will call into unsafe APIs in at least a few places, somewhere down the call stacks.

But you'll still have swathes of code that doesn't ultimately end up calling an unsafe API, which can be trivially considered memory safe.

AlotOfReading 11 hours ago [-]
The language standard assumes that everyone collectively agrees to standard semantics implying certain things. If users don't follow the rules and write something without semantics (undefined behavior), the entire program is meaningless as opposed to just the bit around the violation. You know this, so I emphasize it here because it's entirely incompatible with the view that "good enough" is a meaningful concept to discuss from the PoV of the standard.

Rust does a pretty good job formalizing what the safety guarantees are and when you can assume them. Other languages don't, but they also don't support safety concepts that C++ nominally does, like safety-critical systems. "Good enough" can be perfectly fine for a web service in Go while being grossly inadequate for HPC or safety-critical work.

IAmLiterallyAB 13 hours ago [-]
My limited understanding is: there is no safe subset. (That's what was just discontinued; profiles are the alternative.)

And C++ code simply doesn't have the necessary info to make safety decisions. Sean explains it better than I can https://www.circle-lang.org/draft-profiles.html

jmull 8 hours ago [-]
The analysis you link to is insufficient.

E.g., the first case is "Inferring aliasing". He presents some examples and states, "The compiler cannot infer a function’s aliasing requirements from its declaration or even from its definition."

But why not?

The aliasing requirements come directly from vector. If the compiler has those then determining the aliasing requirements of those functions is straightforward.

Now, maybe there is some argument that a C++ compiler cannot determine the aliasing requirements of vector, but if that's the claim, then the paper should make it, and back it up.

The paper continues in the same vein in the next section, as if the lifetime requirements of map and min cannot be known or cannot bubble up through the functions that call them.

As written, the paper says almost nothing about the feasibility of static analysis of C++ to achieve safety goals for C++.

dwattttt 7 hours ago [-]
I imagine it's (implicitly?) referring to avoiding whole-of-program analysis.

For example, given a declaration

  int* func(int* a);
What's the relationship between the return value and the input? You can't know without diving into 'func' itself; they could be the same pointer or it could return a freshly allocated pointer, without getting into the even more esoteric options.

Trying to solve this without recursively analysing a whole program at once is infeasible.

Rust's approach was to require more information to be provided in function signatures, but that's new syntax, and not backwards compatible, so not a palatable option for C++.
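
To illustrate the ambiguity (my sketch, not from the parent comment): both definitions below satisfy that declaration, yet they imply very different aliasing and ownership for the caller:

    // Same declaration as above; the signature alone cannot distinguish these.
    int* alias_version(int* a) {
        return a;               // the returned pointer aliases the argument
    }

    int* fresh_version(int* a) {
        return new int(*a);     // the caller receives (and now owns) fresh storage
    }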

coffeeaddict1 12 hours ago [-]
> And you could force all lifetime operations to use C++ stdlib refcounting primitives, and then have lifetime safety in a Swift-like way (i.e. eager refcounting everywhere)

That's going to be a non-starter for 99% of serious C++ projects there. The performance hit is going to be way too large.

For bounds checking, sure I think the performance penalty is so small that it can be done.

gmueckl 10 hours ago [-]
You have to realize that the number of locations in code where reference counter adjustment is actually meaningful is rather small, and there are simple rules to keep the excess churn from reference-counting pointer wrappers to a minimum. The main one, as mentioned in the talk the sibling comment called out, is that it is OK to pass a raw pointer or reference to a function while holding on to a reference count for as long as that other function runs (and doesn't leak the pointer through a side effect). This rule eliminates a lot of pointless counter arithmetic from excessive pointer-wrapper copying.
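
A minimal sketch of that rule, using std::shared_ptr as a stand-in for WebKit's smart pointers (my illustration, not from the talk):

    #include <cstdio>
    #include <memory>

    struct Widget { int value = 42; };

    // The callee takes a plain reference, so it does no refcount arithmetic at all.
    // This is fine as long as the caller keeps a strong reference alive for the duration
    // of the call and the callee doesn't stash the pointer somewhere longer-lived.
    void draw(const Widget& w) { std::printf("%d\n", w.value); }

    int main() {
        auto owner = std::make_shared<Widget>();  // the only refcount traffic is here
        draw(*owner);                             // pass by reference while owner is held
    }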
silon42 4 hours ago [-]
Maybe C++ should copy some Swift, before attempting to challenge Rust.
pjmlp 4 hours ago [-]
It already has, multiple times:

Managed C++, C++/CLI, C++/CX, C++ Builder, Unreal C++

But those aren't extensions or approaches WG21 cares about having.

The C++11 GC design didn't even take those experiences into consideration, thus it got zero adoption and was dropped in C++20.

pizlonator 12 hours ago [-]
That would have been my first guess but WebKit's experience doing exactly this is the opposite.

See https://www.youtube.com/watch?v=RLw13wLM5Ko

Note that they also allowed other kinds of pointers so long as their use could be statically verified using very simple rules.

TuxSH 9 hours ago [-]
> For bounds checking, sure I think the performance penalty is so small that it can be done.

Depends on how many times it's inlined and/or if it's in hot code. It can result in much worse assembly code.

Funny thing: C++17 string_view::substr has a bounds check plus an exception throw, whereas span::subspan has neither; I can see substr's approach being problematic performance- and code-size-wise if it's called many times with arguments already validated by the caller.
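
For readers who haven't hit this asymmetry, a small sketch (my illustration) of the two behaviors:

    #include <cstdio>
    #include <span>
    #include <stdexcept>
    #include <string_view>

    int main() {
        std::string_view sv = "hello";
        int arr[] = {1, 2, 3, 4, 5};
        std::span<int> sp(arr);
        (void)sp;

        // string_view::substr validates its offset and throws on failure.
        try {
            (void)sv.substr(10);        // 10 > sv.size(): throws std::out_of_range
        } catch (const std::out_of_range&) {
            std::puts("substr: out_of_range");
        }

        // span::subspan instead has a precondition: an out-of-range offset is undefined
        // behavior, with no check (and no exception) in a normal build.
        // (void)sp.subspan(10);        // would be UB, not an exception
    }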

AlotOfReading 12 hours ago [-]
There'd be less opposition if profiles worked that way. The real goal is to define a subset that excludes 95% of the unsafe stuff, as opposed to providing hard guarantees.
nechuchelo 4 hours ago [-]
I do want C++ to be a safer language, but I don't think inheriting the Rust safety model is the way to go. It is in a way revolutionary, but it has major downsides, like the inability to deal with cyclic data structures without clumsy workarounds.

I don't want to play with a plastic sword, just put it in a sheath.

LorenDB 15 hours ago [-]
The title has potential to be a bit misleading, because as the article says, while Sean Baxter's proposal is not being continued, the committee is working on the Profiles proposal, which still will enable some level of safety. So C++ is still working towards safety, just not the Safe C++ safety.
loeg 15 hours ago [-]
Seems clear enough to me. The "Safe C++" proposal is not being continued. Profiles is not what was proposed in "Safe C++."
TinkersW 15 hours ago [-]
Hardware-level safety will arrive first (see Apple's support for Memory Integrity Enforcement); not as foolproof as Safe C++/Rust, but no need to change code...
pjmlp 4 hours ago [-]
It has already been here since 2015, on Solaris SPARC.
TimorousBestie 15 hours ago [-]
C++ is working towards safety with the same enthusiasm with which I tackle AI-generated merge requests.
coffeeaddict1 15 hours ago [-]
The safety story with Profiles is rather basic (almost laughable honestly) and hardly any improvement over what was already achievable with compiler flags and clang-tidy.
the_black_hand 9 hours ago [-]
"If you want to write like Rust, just write Rust". Exactly. No reason not to. RIP cpp
heresie-dabord 8 hours ago [-]
> "If you want to write like Rust, just write Rust".

And if you want to write like D...

juliangmp 2 hours ago [-]
I'm unsure about the profiles. If they add restrictions and need to be enabled as a compiler flag, no legacy project will use them, since they'd probably get like 4 errors and say "oh, this option breaks my code, but the code has been running for years so it's fine".
vdfs 1 hours ago [-]
Neither will those projects use Safe C++
Panzerschrek 5 hours ago [-]
The mentioned proposal isn't really that great. It basically tries to turn C++ into Rust by blindly copying many of its ideas. Many of them aren't strictly necessary to achieve safety in C++. There are different proposals for safety, which are way smaller and simpler than this one.
usamoi 11 hours ago [-]
They are not rejecting Safe C++; they are rejecting memory safety. The majority of them believe that memory safety is just hype, and the minority who know it's a problem don't want to restrict how they code. If the code runs, it is fine. If it does not, the coder running is fine too.
steveklabnik 11 hours ago [-]
The principles document that was accepted feels very targeted at Safe C++ specifically. It’s fair to say they rejected it.
MattDamonSpace 7 hours ago [-]
I work on a Swift/iOS app that wraps a C++ library

90+% of our crashes are hard-to-diagnose C++ crashes. Our engineers are smart and hardworking, but they throw their hands up at this.

Please tell me my options aren’t limited to “please be better at programming”…?

usamoi 3 hours ago [-]
> Our engineers are smart and hardworking but they throw their hands up at this.

Since you don't think this is a skill issue, shouldn't you support Safe C++, which eliminates unsafety rather than just turning a blind eye to it?

> Please tell me my options aren’t limited to “please be better at programming”…?

You can only use Valgrind/ASan, stress testing, and rewriting in other languages to pay off the technical debt. Even if a god points out every bug in your code, you'd still need to put in great effort to fix them. If you don't pay for it while coding, then you must pay for it after coding. There are no shortcuts.

dagmx 4 hours ago [-]
Have you tried enabling asan? It’s not really the same kind of language guarantees but it does catch a lot of the same errors.

In general I think static analysis is a crutch for C++ to claim safety but it is still a very useful tool that devs should use in development.

lallysingh 6 hours ago [-]
Call me stupid for asking, but what is "safe" here? I get the length-checked buffer copies and accesses, is there anything else? Less allowed type conversions?
MForster 5 hours ago [-]
You are talking about spatial safety. There are a few other types of memory safety:

- temporal safety (e.g. no use-after-free)

- initialization safety (no reads of uninitialized memory)

- thread safety (no data races)

- type safety (accessing memory with the correct type)

Davidbrcz 2 hours ago [-]
You can't convince me that 'safe c++' is not an oxymoron.
quotemstr 15 hours ago [-]
Okay, so treat the C++ standards committee the same way the HTML5 people treated W3C. If they insist on making themselves irrelevant, let them.

Profiles cannot achieve the same level of safety as Rust, and it's obvious to anyone who breathes. Profiles just delete stuff from the language. Without lifetimes reified as types you can't express semantics with enough precision to check them. The moment string_view appears, you're horked.
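
A classic illustration (my sketch) of the string_view problem being alluded to; nothing in the types records that the view must not outlive the string it borrows from:

    #include <cstdio>
    #include <string>
    #include <string_view>

    std::string make_name() { return "temporary"; }

    int main() {
        // The temporary std::string returned by make_name() is destroyed at the end of
        // this statement, so sv dangles immediately, and the compiler happily accepts it.
        std::string_view sv = make_name();
        std::printf("%s\n", sv.data());   // use-after-free: undefined behavior
    }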

Okay, so you ban all uncounted reference types too. Now what you're left with isn't shit Rust but instead shit Swift, one that combines the performance of a turtle with the ergonomics of a porcupine.

There's no value in making things a little bit safer here and there. The purpose of a type system is to embed proofs about invariants. If your system doesn't actually prove the invariant, you can't rely on it and you've made a shitty linter.

Continue the safe C++ work outside the context of the C++ standards committee. Its members, if you ignore their words and focus on the behaviors, would rather see the language become irrelevant than change decades old practices. Typical iron law of bureaucracy territory.

int_19h 15 hours ago [-]
The reason why WHATWG was able to take over HTML like that is because all the people & companies that were actually making the browsers were onboard.

With C++, my impression is that most implementers simply aren't interested. And, conversely, most people who might be interested enough to roll a new implementation have already moved to Rust and make better use of their time improving that.

favorited 14 hours ago [-]
The browser companies weren't just onboard with WHATWG – they literally are WHATWG. The WHATWG steering committee is Apple, Google, Microsoft, and Mozilla.
pjmlp 4 hours ago [-]
You're right, because the main OS companies that contribute to the surviving compilers that still care about ISO are now focused elsewhere.

Apple with Swift, Google with Go, Java/Kotlin, Rust, Microsoft with C#, Go, Java and Rust.

Most modules stuff on GCC was done by a single developer, if I am not mistaken.

Notice how Visual C++ blog posts are mostly about game-development-related improvements and Visual Studio tooling.

Everyone else's compilers are still stuck on C++14, or C++17 if they're lucky.

quotemstr 14 hours ago [-]
You're right, but dammit, I wish you weren't. A world in which we can evolve existing large C++ codebases gradually towards safety instead of having to RRiR is a better world.

There are lots of cool innovations C++ made that will just disappear from the Earth, forever, if C++ can't be made memory safe in the same sense Rust is. I mean, Rust doesn't even support template specialization.

I don't think it's too late for someone to fork both C++ and Clang and make something that's actually a good synthesis of the old and the new.

But yeah, the most likely future is one on which C++ goes the way of Fortran (which still gets regular updates, however irrelevant) and the energy goes into Rust. But I like to rage, rage, against the dying of the type based metaprogramming.

BoxFour 14 hours ago [-]
> I don't think it's too late for someone to fork both C++ and Clang and make something that's actually a good synthesis of the old and the new.

People have tried variants of this already: Carbon, for example. I don’t think anyone outside of Google uses it, though, and even within Google I suspect it’s dwarfed by regular C++.

I don’t think C++ will become irrelevant for a long time. Recent standards have added some cool new features (like std::expected), and personally I feel like the language is better than ever (a biased opinion obviously).

Memory management is still a huge elephant in the room, but I don’t think it’s becoming irrelevant.

pjmlp 4 hours ago [-]
Carbon is still being designed, and their goal is to migrate existing code, for Google purposes.

They are quite open that everyone else should use a managed language or Rust.

Yoric 14 hours ago [-]
FWIW, Rust doesn't have specialization yet because it's really hard to get right without introducing new undefined behavior.

This doesn't mean that it's not possible to achieve a safe subset of C++ that supports template specialization, but it suggests that we aren't going to see it any time soon.

tialaramex 14 hours ago [-]
Well, like you said, Fortran didn't actually go anywhere. Fortran 77 is a terrible programming language, but you can't seriously claim it "disappeared from the Earth".

Not that long ago tsoding was like "I should learn Fortran" and wrote a bunch of Fortran. Obviously from his perspective some things about Fortran are awful because it's very old, but it wasn't somehow impossible to do.

There are a few really amazing things which have been achieved in C++ like fmt, a compile time checked, userspace library for arbitrarily formatting variadic generic parameters. That's like man on the moon stuff, genuinely impressive. Mostly though C++ is a garbage fire and so while it's important to learn about it and from it we should not keep doing that.

Yoric 14 hours ago [-]
> There are a few really amazing things which have been achieved in C++ like fmt, a compile time checked, userspace library for arbitrarily formatting variadic generic parameters.

Anecdotal, but that's hardly unique to C++. So even if C++ were to disappear overnight (which we all agree won't happen), this wouldn't be a burning-library-of-Alexandria moment.

tialaramex 14 hours ago [-]
Well. What other examples of this feat are you thinking of?

To me the things which come to mind are either compiler magic (e.g. C printf) or they rely on RTTI (e.g. Odin and similar C-like languages) and neither of those is what fmt does, they're "cheating" in some sense that actually matters.

Yoric 3 hours ago [-]
I co-implemented this in OCaml ~20 years ago, using no compiler magic (it did use a macro). Pretty sure that I've seen the equivalent in Haskell, too, and in variants of Scheme. Zig has it, too. And while, as you mention, the Rust implementation does use compiler magic, I'm pretty sure that it could be implemented as a macro, just more slowly.

etc.

int_19h 6 hours ago [-]
https://doc.rust-lang.org/std/macro.format.html

https://zig.guide/standard-library/formatting/

notmywalrus 4 hours ago [-]
Rust is an example of "compiler magic" in this case.

You're right about Zig, and reading the source [1] I'm still kind of impressed how unified they made comptime / runtime.

[1]: https://github.com/ziglang/zig/blob/32a1aabff78234b428234189...

pklausler 13 hours ago [-]
Judging Fortran by looking at Fortran 77 is preposterously uninformative.
worthless-trash 9 hours ago [-]
What should we be looking at ?
slavik81 8 hours ago [-]
If we're judging the state of Fortran in 2025, we should probably be discussing Fortran 2018 or Fortran 2023.
fooker 13 hours ago [-]
> fork both C++ and Clang

Great! What are you waiting for?

If you try to answer that question, you'll also find why other similar projects are not finding much traction yet.

safercplusplus 13 hours ago [-]
> Profiles cannot achieve the same level of safety as Rust

So the claim is that the scpptool approach[1] can, while remaining closer to traditional C++, and not requiring the introduction of new language elements. Since the scpptool-enforced safe subset of C++ is an actual subset of C++, conforming code continues to build with your existing compiler. It just uses an additional static analyzer to check conformance.

For the 90% or whatever of C++ code that is not actually performance sensitive, the associated SaferCPlusPlus library provides drop-in and "one-to-one" safe replacements for unsafe C++ elements (like standard library containers and raw pointers). (For example, if you're worried about potentially invalid vector iterators, you can just replace your std::vector<>s with mse::mstd::vector<>s.) With these elements, most of the safety is enforced in the type system and not reliant on the static analyzer.
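
For illustration, the kind of swap being described (my sketch; the header name is my assumption and may differ from the actual project, and the library has to be on your include path):

    // Illustrative only: assumes the SaferCPlusPlus headers are available.
    #include "msemstdvector.h"
    #include <vector>

    int main() {
        std::vector<int> v1{1, 2, 3};       // out-of-range access or a stale iterator is UB
        mse::mstd::vector<int> v2{1, 2, 3}; // drop-in replacement per the parent comment;
                                            // such misuse is caught at runtime instead
        // Migrating is largely a matter of swapping the type name.
    }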

Conforming implementations of performance-sensitive code would be more restricted and more reliant on the static analyzer for safety enforcement. They sometimes require the use of library elements, like "borrowing objects", which may not have analogies in traditional C++. But overall, even high-performance conforming code remains very recognizable C++.

The claim is that the scpptool approach is a straightforward path to full memory (and data race) safety for C++, and the one that requires the least code migration effort. (And again, as an actual subset of existing C++, not technically dependent on standard committees or compiler vendors for its implementation or deployment.)

[1]: https://github.com/duneroadrunner/scpptool/blob/master/appro...

tiberius_p 15 hours ago [-]
I'm not up to date with the latest developments in C++, but wouldn't it be straightforward to do something like "#pragma pointer_safety strong", which would force the compiler to only accept the use of smart pointers or something along those lines? Has anything like this been proposed so far?
favorited 14 hours ago [-]
You might be interested in this talk[0] by a WebKit engineer on how they're implementing similar approaches using libTooling and their own smart pointer types.

For example, their tooling prevents code like this:

    if (m_weakMember) { m_weakMember->doThing(); }
from compiling, forcing you to explicitly create an owning local reference, like so:

    if (RefPtr strongLocal = m_weakMember.get()) { strongLocal->doThing(); }
unless it's a trivial inlined function, like a simple getter.

[0] https://www.youtube.com/watch?v=RLw13wLM5Ko

pizlonator 13 hours ago [-]
I was going to link to this.

My interpretation of Geoff's presentation is that some version of profiles might work, at least in the sense of making it possible to write C++ code that is substantially safer than what we have today.

tialaramex 10 hours ago [-]
Geoff's stuff is mostly about heuristics. For his purpose that makes sense. If Apple is spending, say, $1Bn on security problems and Geoff spends $1M cutting such problems by 90%, that's money well spent. The current direction of profiles is big on heuristics: easy quick wins to maybe get C++ under the "radioactively unsafe" radar even if it can't pass for safe.

The most hopeful thing I saw in Geoff's talk was cultural. It sounds like Geoff's team wanted to get to safer code. Things went faster than expected, people landed "me too" unsolicited patches, that sort of thing. Of course this is self-reported, but assuming Geoff wasn't showing us a very flattering portrait of a grim reality (and I can't see any incentive for that), this team sounds like one that would still get value from Rust but is delivering many of the same security benefits in C++ anyway.

Bureaucrats don't like culture because it's hard to measure. "Make sure you hire programmers with a good culture" is hard to chart. You're probably going to end up running some awful quiz your team hates ("Answer D, A, B, B, E, B to get 100%"). Whereas "Use Rust, not C++" is measurable: team A has 93% of its code in Rust, but team B scored 94.5%, so that's more Rust; they win.

pizlonator 9 hours ago [-]
> Geoff's stuff is mostly about heuristics.

That's not true at all.

- The bounds safety part of it prevents those C operations that Fil-C or something like it would dynamically check. You have to use hardened APIs instead.

- The cast safety part of it prevents C casts except if they're obviously safe.

- The lifetime safety part of it forces you to use WebKit's smart pointers except when you have an overlooking root.

Those are type safety rules. It's disingenuous to call them heuristics.

It is true, however, that Geoff's rules don't go to 100% because:

- There are nasty corners of C that aren't covered by any of those rules.

- WebKit still has a small fraction of code (<10%-ish) that isn't opted into those rules.

- WebKit has JITs.

tialaramex 1 hours ago [-]
I can't rationalize how "prevents... except" isn't still just heuristics.

r/cpp is full of people with such heuristics: ways that they personally end up with fewer safety bugs in their software. That's how C++ got its "core guidelines", and it is clearly the foundation of Herb's profiles. You can't get to safety this way; you can get closer than you were in a typical C++ codebase, and for Geoff that was important.

recursivecaveat 14 hours ago [-]
I don't think that really accomplishes anything. If you interpret it broadly enough to meaningfully improve safety you have to ban so much stuff that no codebase will ever turn it on. It's a pretty straightforward locally-verifiable property as well, so people who really want it don't need a pragma to enforce it.
randomNumber7 14 hours ago [-]
The problem with this would probably be that you usually have to use some libraries with C APIs and regular pointers.

You could compile your program with AddressSanitizer; then it at least crashes in a defined way at runtime when memory corruption would otherwise happen (a short sketch is below). TCC (the Tiny C Compiler, initially written by Fabrice Bellard) also has such a feature, I think.

This of course makes it significantly slower.
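
A minimal sketch of that (standard -fsanitize=address usage with GCC or Clang; nothing project-specific assumed):

    // Build with: g++ -fsanitize=address -g uaf.cpp
    #include <iostream>

    int main() {
        int* p = new int[4]{1, 2, 3, 4};
        delete[] p;
        // Heap-use-after-free: ASan aborts here with a report instead of
        // letting the program silently read freed memory.
        std::cout << p[0] << "\n";
        return 0;
    }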

BoxFour 14 hours ago [-]
> "#pragma pointer_safety strong" which would force the compiler to only accept the use of smart pointers

You’d possibly just be trading one problem for another though - ask anyone who’s had to debug a shared ownership issue.

quotemstr 14 hours ago [-]
That's what the "Profiles" feature is. The problem is that any nontrivial real world program in a non-GC language needs non-owning reference types to perform well, and you can't express the rules for safe use of non-owning references without augmenting the language. People have tried. You need something more sophisticated than using smart pointers for everything. In the limit, smart pointers for everything is just called "Python".
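
A small sketch of that trade-off (the Widget type and function names are made up for illustration):

    #include <iostream>
    #include <memory>
    #include <string>

    struct Widget { std::string name; };

    // Non-owning: cheap, but nothing in the language checks that the Widget
    // outlives this call chain; the rules for using it safely live in your head.
    void print_name(const Widget* w) { std::cout << w->name << "\n"; }

    // Owning: always safe to hold, but every caller pays for shared_ptr copies
    // and atomic refcounting - the "smart pointers for everything" cost.
    void print_name_owned(std::shared_ptr<const Widget> w) { std::cout << w->name << "\n"; }

    int main() {
        auto w = std::make_shared<Widget>(Widget{"gadget"});
        print_name(w.get());
        print_name_owned(w);
    }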

What infuriates me about the C++ safety situation is that C++ is by and large a better, more expressive language than Rust, particularly with respect to compile-time, type-level metaprogramming. And I am being walked, hands handcuffed behind my back, alongside everyone else, into the Rust world with its comparatively anemic proc macro shit, because the C++ committee can't be bothered to care about memory safety.

Because of the C++ standards committee's misfeasance, I'm going to have to live in a world where I don't get to use some of my favorite programming techniques.

int_19h 6 hours ago [-]
> You need something more sophisticated than using smart pointers for everything. In the limit, smart pointers for everything is just called "Python".

I don't see how that follows at all. What makes Python Python (and slow!) is dynamic dispatch everywhere, down to the most primitive things. Refcounted smart pointers are a very minor thing in the big picture, which is why we've seen Python implementations without them (Jython, IronPython). Performance-wise, yes, refcounting certainly isn't cheap, but if you just do that and keep everything else C++-like, the overall performance profile of such a language is still much closer to C++ than to something like Python.

You can also have refcounting + something like `ref` types in modern C# (which are essentially restricted-lifetime zero-overhead pointers with inferred or very simplistic lifetimes):

https://learn.microsoft.com/en-us/dotnet/csharp/language-ref...

It doesn't cover all the cases that a full-fledged borrow checker with explicit lifetime annotations can, but it does cover quite a few; perhaps enough to adopt the position that refcounting is "good enough" for the rest.

Yoric 14 hours ago [-]
> In the limit, smart pointers for everything is just called "Python".

To be more precise, it's old Python. Recent versions of Python use a gc.

> And I am being walked hands handcuffed behind my back, alongside everyone else, into the Rust world with its comparatively anemic proc macro shit because the C++ committee can't be bothered to care about memory safety.

Out of curiosity (as someone working on static analysis), what properties would you like your compiler to check?

quotemstr 14 hours ago [-]
I've been thinking for a while now about using dependent typing to enforce good numerics in numerical kernels. Wouldn't it be nice if we could propagate value bounds and make catastrophic cancellation a type error?
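
A hypothetical sketch of the kind of thing I mean (the Bounded template and its deliberately crude "ranges must not overlap" rule are made up here; it relies on C++20 floating-point non-type template parameters):

    template <double Lo, double Hi>
    struct Bounded {
        static_assert(Lo <= Hi);
        double value;
    };

    // Subtraction compiles only when the operand ranges are disjoint, so the
    // result can't land near zero and blow up the relative error.
    template <double ALo, double AHi, double BLo, double BHi>
    auto operator-(Bounded<ALo, AHi> a, Bounded<BLo, BHi> b) {
        static_assert(AHi < BLo || BHi < ALo,
                      "operand ranges overlap: possible catastrophic cancellation");
        return Bounded<ALo - BHi, AHi - BLo>{a.value - b.value};
    }

    int main() {
        Bounded<1.0, 2.0> a{1.5};
        Bounded<10.0, 20.0> b{12.0};
        auto ok = b - a;  // disjoint ranges: compiles, result bounds are [8, 19]
        // auto bad = a - Bounded<1.5, 3.0>{2.9};  // overlapping ranges: static_assert fires
        (void)ok;
    }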

Have you worked much with SAL and MIDL from Microsoft? Using SAL (an aesthetically hideous but conceptually beautiful macro-based gradual typing system for C and C++), you can overlay guarantees about not only reference safety but also sign-comparison restrictions, maximum buffer sizes, and so on.
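
For readers who haven't seen SAL, a tiny example of the kind of annotation it provides (sal.h ships with the Windows SDK; the function itself is made up):

    #include <sal.h>
    #include <stddef.h>

    // _In_reads_(len) tells the analyzer (and the reader) that buf must point
    // to at least len readable ints, so mismatched buffer sizes at call sites
    // can be flagged statically.
    int sum(_In_reads_(len) const int* buf, size_t len) {
        int total = 0;
        for (size_t i = 0; i < len; ++i) total += buf[i];
        return total;
    }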

zevets 14 hours ago [-]
Please do this.

But first: we need to take step zero and introduce a type "r64": an "f64" that is not NaN/Inf.

Rust has its uint-thats-not-zero - why not the same for floating point numbers??

tialaramex 14 hours ago [-]
You can write your "r64" type today. You would need a perma-unstable compiler-only feature to give your type a huge niche where the missing bit patterns would go, but otherwise there's no problem that I can see, so if you don't care about the niche it's just another crate - there is something similar called noisy_float
zevets 13 hours ago [-]
I can do it, and I do similar such things in C++ - but the biggest benefit of "safe defaults" is the standardization of such behaviors, and the resultant expectations/ecosystem.
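
For what it's worth, a minimal C++ sketch of that kind of wrapper (hypothetical, not any particular library):

    #include <cmath>
    #include <stdexcept>

    // "r64": a double guaranteed finite (no NaN, no Inf) once constructed,
    // so downstream code can skip re-checking.
    class r64 {
        double v;
    public:
        explicit r64(double d) : v(d) {
            if (!std::isfinite(d)) throw std::domain_error("r64: non-finite value");
        }
        double get() const { return v; }
    };
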
1718627440 13 hours ago [-]
> Rust has its uint-thats-not-zero

Why do we need to single out a specific value? It would be way better if we could also use uint-without-5-and-42. What I would wish for is type attributes that really belong to the type:

    typedef unsigned int __attribute__ ((constraint (X != 5 && X != 42))) my_type;
int_19h 6 hours ago [-]
Proper union types would get you there. If you have them, then each specific integer constant is basically its own type, and e.g. uint8 is just (0|1|2|...|255). So long as your type algebra has an operator that excludes one of the variants from the union to produce a new one, it's trivial to exclude whatever, and it's still easy for the compiler to reason about such types and to provide syntactic sugar for them like 0..255 etc.
steveklabnik 13 hours ago [-]
Those are the unstable attributes that your sibling is talking about.
1718627440 13 hours ago [-]
Yeah, of course I can put what I want in my toy compiler. My statement was about standard C. I think that's what Contracts really are, and I hope this will be included in C.
steveklabnik 11 hours ago [-]
Oh sure, I wouldn’t call rustc a “toy compiler” but yeah, they’d be cool in C as well.
strus 15 hours ago [-]
I don’t think anyone is surprised.
westurner 8 hours ago [-]
Rust then?

From "The state of Rust trying to catch up with Ada [video]" https://news.ycombinator.com/item?id=43007013 :

> [awesome-safety-critical]

> rustfoundation/safety-critical-rust-consortium: https://github.com/rustfoundation/safety-critical-rust-conso...

rust-lang/fls: https://github.com/rust-lang/fls

How does what FLS enables compare to these Safe C++ proposals?

Safe C++ draft: https://safecpp.org/draft.html

flykespice 13 hours ago [-]
C++ will never be safe as long as its C roots persist; it doesn't matter how many features you add on top of C++ to make writing safe programs more convenient.

You would need to strip the "inherently unsafe" C roots out of C++, but it wouldn't be called C++ anymore at that point.

winrid 12 hours ago [-]
C+++ :)
ImTotallyVegan 10 hours ago [-]
C+-