HI HACKER NEWS! Excited to see you!!! Ping me if you find typos.
(Also jeez, writing this took way too long and I spent too much time editing it trying to cram everything from Cpp2, the Google governance issue, member access operators as a case study, some historical bits, etc. into the post.)
EDIT: One of the most interesting things for me is that (so far) no one complained about the lack of Carbon code examples.
kayvr 35 days ago [-]
FWIW, I thought the bits on governance and history provided much needed context. Great post.
Mond_ 35 days ago [-]
Thanks a lot! When writing something like this it can be hard to keep in mind what your average reader is familiar with.
"Do I need to get into the history and the structural process of the C++ Standard Committee? I'm sure everyone already knows about that, right?"
piebro 35 days ago [-]
Thanks for the post, I enjoyed reading it.
There is a small typo in this sentence: "As long as we’re willing to say that Carbon is is about reducing the reliance on the C++ Standard Committee ...". There are two "is".
PaulDavisThe1st 34 days ago [-]
It's a hangover from the ill-fated Clinton programming language, where you repeat an operator (such as "is") to ensure there is no ambiguity about what your intention is.
zelphirkalt 34 days ago [-]
Did you mean to write "there is is no ambiguity ..."?
mattigames 34 days ago [-]
Too early for that pun, it works better at the end as a punchline of sorts, "ensure there is not ambiguity about what your intention is is."
funkychicken 34 days ago [-]
You mentioned that std::unique_ptr had a runtime cost (versus a raw pointer). I had always assumed the compiler would optimize any of that away (if you have the default deleter). Could you explain what you meant?
Oh thanks! That’s really depressing though, I feel like my whole world view has been shattered lmao
scrubs 34 days ago [-]
Technically kind of interesting ... but as a writing piece it's a mess of mushy, roundabout, haphazard points. With every line I read I can just hear my uni English prof in my ear yelling.
Mond_ 34 days ago [-]
I can live with that, I think. I'd like to believe that all of the points are relevant.
The issue is that I had to cram them all in somehow, without making the article excessively long. I'm open to criticism, though!
adrian_b 35 days ago [-]
I have no idea whether Carbon will be successful, but this is the only right way of evolving a programming language when incompatible changes must be made: by providing tools that guarantee a completely automatic migration of the legacy programs.
Hopefully Carbon will succeed in achieving this goal.
eej71 34 days ago [-]
I think the idea of "when incompatible changes must be made" has caused much harm and damage to various language projects. I'm specifically thinking of the damage done as part of the python 2->3 changes and, to a lesser extent, the damage done with the original conception of what was called perl6.
C++ is definitely in a tough spot on the evolutionary trail. But the idea that the only path ahead is through incompatible changes seems likely to produce the same harmful effects.
tialaramex 34 days ago [-]
In 2019, Epochs was proposed for C++. That's "too late" for C++20, but only by convention. Epochs proposed to give C++ a way to evolve while retaining better backward compatibility, much like Rust's Editions (the 2024 Edition stabilized last year and will ship in Rust 1.85 in a week or two).
Instead, C++ has added all sorts of slightly incompatible features every three years since 2011, and is expected to do so again. Periodically one of these incompatible changes is especially troublesome for important people and, like a naughty toddler, the committee promises not to do that again.
Yet, despite these incompatible changes which might have accidentally larger consequences than expected, for fear of the consequences other changes which were known to be incompatible but seem "worth it" are rejected. The worst of both worlds.
pjmlp 34 days ago [-]
In practice, epochs offer little more than -std=lang-version, because they only cover grammar changes, and a few selected semantics.
There is no plan for binary libraries across epochs, how different implementations would interact with each other, how multiple crates requiring different versions would interact with each other if their public API crosses versions with different semantics,...
tialaramex 34 days ago [-]
I'd argue that one of the things we saw with Rust's Editions is that it significantly increases appetite for such language improvements.
There's a recent Reddit discussion which had zero pushback against changes but instead people who were disappointed that 2024 Edition won't land all the things they'd hoped for such as the improved Range types†
Without the Edition system we know from C++ that when you say "Why can't we have X?" the defenders appear to tell you that all choices except the status quo are impossible. They will do this regardless of how difficult such a change would be, and so this immediately deflates all interest in incremental improvement. It's a self-fulfilling prophecy.
But with Editions there's a lot of things we definitely can do. Once you open the door to that, it really drives enthusiasm and that enthusiasm can power some of the difficult changes you wouldn't even have considered in the C++ process.
† In Rust, syntax like 1..=4 is compiled into the core type core::ops::RangeInclusive<i32>, so this is a very natural way to express what you meant, and yet it's not a goofy special case, it's just a normal literal like "Word" [an &'static str] or '€' [a char] or 1.2345 [a floating point defaulting to f64]. However, we now realise these range types made some poor choices: if you made them today you'd make them implement Copy and IntoIterator, whereas the existing types implement Iterator and so in many cases cannot implement Copy. Ergonomically a Copy type would be much nicer here. Can we fix that? Well, the idea is, we make the new types (they exist in nightly Rust in the new core::range namespace but aren't stabilized), and then once we're sure they're exactly what we want, we make a new Edition which changes the syntax transformation so that that literal is a core::range::RangeInclusive<i32> instead.
Doxin 29 days ago [-]
> I'm specifically thinking of the damage done as part of the python 2->3 changes
Though on the other hand, we can only imagine the alternative of not making those changes. I reckon it would be at least as bad as the 2->3 changes. At the very least I wouldn't want to go back to a world where bytes and strings were unified.
pjmlp 34 days ago [-]
C++ has already had quite a few incompatible changes: your C++98 code will not compile in a C++23 mode compiler if it happens to use one of the affected features.
eej71 34 days ago [-]
Ok, so could we agree, then, that doubling down on even more breaking changes is likely to create more codebases stuck at these "evolutionary bottlenecks"?
Whenever I hear someone say - "Gee, the only path ahead is a breaking change, sorry charlie!" - what I really hear is - "I gave up thinking up a solution for how to do it in an evolutionary way, and I am lazy and just want to declare a revolution!"
There is always a pathway ahead on the evolutionary path. That's the premise at least.
jcranmer 34 days ago [-]
The real problem with ABI breaks isn't so much the need to (automatically or manually) migrate legacy programs, but that so much of modern software is composed of multiple pieces that don't have any way to coordinate a lock-step ABI migration.
Take something like std::unique_ptr. The basic problem is that it is absolutely capable of being written such that std::unique_ptr<T> has the same ABI as a T*, but it doesn't (instead it is passed in memory rather than in a register like a T*). It's pretty damn trivial to write the necessary thunking between the ABI versions, so it's also absolutely possible to have both old-ABI and new-ABI coexist (something that was a lot more difficult for the infamous Python 2->3 transition). And other than the internal changes needed to give it a new ABI, there's not really any user-visible API that would need to be migrated. But it still can't be done, because std::unique_ptr is used on the interface between library A and library B, and how is A supposed to know which version of the ABI B is expecting, when neither A nor B can coordinate a lock-step switch?
Tijdreiziger 35 days ago [-]
From the article:
> The goal is a tool-assisted migration of idiomatic code, not a fully automated migration of all code.
adrian_b 35 days ago [-]
That is just an acknowledgment of the fact that for something as complex as C++ a fully automated migration is unlikely to be achievable.
This does not mean that they will intentionally avoid making a fully automated migration possible.
Normally the migration tools should be designed to attempt a fully automated migration, but whenever there are corner cases for which handling them completely automatically would not be worth the effort, human intervention will be required.
tialaramex 34 days ago [-]
Yes.
In C++ in particular one of the most obvious ways to write unmaintainable C++ which such automation couldn't be expected to migrate is via abuse of the pre-processor.
C++ retains the entire C pre-processor, which is a weird not-quite-language-agnostic text macro language on top of C. We can abuse this to write a different programming language, with different rules, and then at compile time the pre-processor transforms it into C++ which compiles.
I'm confident a migration tool could not always usefully translate this, and I'd argue that's not a failing of the tool so much as an inevitable consequence of writing this unmaintainable code.
owlstuffing 34 days ago [-]
> but this is the only right way of evolving a programming language when incompatible changes must be made: by providing tools that guarantee [it]
Yep. I'm not a particularly big fan of Google languages, but focusing on bridging legacy code is brilliant.
Would have been interesting if Oracle had done the same with Java shortly after acquiring it. Java is so hemmed in by legacy compatibility it basically can't make any significant language or VM changes without making harsh compromises. Of course, Oracle would likely have supported legacy Java indefinitely, and made a fortune supporting/licensing it.
atribecalledqst 35 days ago [-]
Have to say, it bothers me a little bit that they named it Carbon. I associate that term strongly with the old Carbon API from Apple.
Carbon was officially removed with 10.15 Catalina in 2019 - what's the statute of limitations on reusing a name like this?
chandlerc1024 34 days ago [-]
(Carbon lead here)
We tried other names, but we found collisions with essentially all of them. =/ We ended up picking a "least bad", and actually talked to a couple of folks familiar with the old usage to see if it was a worse collision than we realized. They weren't delighted but generally shrugged. So here we are. =/
It's definitely not perfect, but I think it's much more searchable than "C" or some other choices. Ultimately, I think it's at least not bad enough to matter compared to the actual project.
phaedryx 34 days ago [-]
Why not create a word/name? e.g. Clojure
chandlerc1024 34 days ago [-]
We played with some, but none stuck.
A big goal was being short and easily pronounced, including by non-native English speakers, in a recognizable way from reading the text. That made the overwhelming majority of "fun" spellings not work well.
On one hand, I feel like we're just not as good at naming as Rust and Zig. Both of those names are :chefskiss:
On the other hand, Carbon does have a bunch of awesome puns waiting for us... So we've got that going for us. =D
thechao 33 days ago [-]
Hey, Chandler— this article says you're using "c++0x" concepts. Is that Indiana-style concepts?
chandlerc1024 32 days ago [-]
It was also called that.
But it isn't that we're using this directly; it's that definition-checked generics are fairly similar to the ideas in that series of proposals, which also led to the generics in Swift. They're closely related to the generics in Rust, etc., as well.
theanonymousone 34 days ago [-]
Cloce
mrkpdl 35 days ago [-]
I feel the same, especially given how significant Carbon was to the revitalisation of Apple. Without Carbon they likely would have lost several key developers in the Mac OS X transition, which all of their later success stems from.
krackers 34 days ago [-]
If this takes off, it will end up poisoning search results for anyone interested in retrocomputing.
Mond_ 35 days ago [-]
Meh. There's only so many good names out there. When there's little risk of confusion it's usually fine.
When something has been deprecated (like Carbon API which I didn't even know about) then it's imo completely fair game.
stevefolta 34 days ago [-]
Depends on your definition of "a good name". It seems like yours includes "must be a short English word", but doesn't include things like "is easily web-searchable" and "doesn't conflict with existing names". Throwing out the "short English word" criterion opens up a universe of names like "Wubulus" or "Flarnit".
jimbob45 34 days ago [-]
Shoulda named it “stroopwafel” in honor of Bjarne Stroustrup. Never mind that one is Danish and one is Dutch.
Also naming things is surely the most fitting use case for ChatGPT, no?
lproven 33 days ago [-]
> like Carbon API which I didn't even know about
Yeah, I hear that kind of excuse a lot.
Last time, someone had come up with some kind of new database-oriented language or something and they called it "Limbo".
Limbo is the programming language in Inferno. Plan 9 is what the Unix creators did next -- it's UNIX 2. Inferno is UNIX 3; it's what Plan 9 developed into.
It is the next language from the team that developed C.
It may not be widely-used but it's important, significant, and just as someone knowing their history makes me take their work more seriously, someone not knowing their history makes me think they have less to contribute, because they clearly haven't gone looking at prior art.
Ignorance is no excuse.
For someone to know what they're doing, they need to have at least a vague idea of whose shoulders they're standing on (as Isaac Newton put it). If they don't, they could be reinventing a wheel, and if they call things "struts" and "roundbuffers" and "spinny-pivots" then this says they don't know about "spokes" and "tyres" and "hubs". And making it hexagonal.
The flipside of this coin is making life easier for the community to search and learn.
anonnon 34 days ago [-]
I've never done any Mac programming, yet that was my reaction when I heard about it. Why would you choose that name for anything C++-related?
tbrownaw 34 days ago [-]
> Carbon was officially removed with 10.15 Catalina in 2019
Um, Catalina is a part of the Tomcat Java application server. Not sure what that has to do with Apple stuff.
rednafi 35 days ago [-]
I haven’t written C++ since I graduated and hopefully won’t have to, but this looks really good.
While I love Go, I feel like languages like Go and Rust didn’t become C++ killers because they expected everyone to jump on the bandwagon and abandon all their legacy code. That didn’t happen.
This approach of keeping the interop intact might just work.
sapiogram 35 days ago [-]
> I feel like languages like Go and Rust didn’t become C++ killers because they expected everyone to jump on the bandwagon and abandon all their legacy code
What gave you that impression? I'd say approximately 0 people from the Go community and at most 2 people from Rust expected that.
npalli 35 days ago [-]
>> I'd say approximately 0 people from the Go community
Quite literally that's what Rob Pike (golang co-creator) thought was going to happen
> I was asked a few weeks ago, "What was the biggest surprise you encountered rolling out Go?" I knew the answer instantly: Although we expected C++ programmers to see Go as an alternative, instead most Go programmers come from languages like Python and Ruby. Very few come from C++.
sapiogram 35 days ago [-]
The person I responded to said that "they expected everyone to jump on the bandwagon and abandon all their legacy code".
That's not even close to what Rob Pike wrote on his blog.
pjmlp 34 days ago [-]
Rob Pike literally wrote that, as the OP points out.
What part of "Although we expected C++ programmers to see Go as an alternative" isn't clear enough?
sapiogram 34 days ago [-]
Nothing about that implies re-writing legacy code.
pjmlp 34 days ago [-]
Neither does this remark
> I feel like languages like Go and Rust didn’t become C++ killers because they expected everyone to jump on the bandwagon and abandon all their legacy code
Abandon is not a synonym for rewrite, last time I checked a dictionary.
UncleEntity 34 days ago [-]
Perhaps the reason they didn't see many C++ coders is because of this?
sapiogram 34 days ago [-]
Because of what?
nicce 34 days ago [-]
Very odd to hear that from a Golang co-creator. I don't really see how Go could actually compete with C++. Maybe in a few cases, where people had accidentally chosen C++ incorrectly for their projects.
ncruces 34 days ago [-]
It's odd because at Google people still write networked servers in C++, which I'd argue, almost no one outside Google does?
Rob probably assumed Go could displace this. And it's not unreasonable to assume so, although it's closer to a better Java, than a better C++ for this.
Instead it displaced Python for this (not for research/NumPy/Colab stuff), maybe some Java (where it's easier to containerize).
And if it did displace C++, it was in greenfield projects, with non-C++ developers. So it didn't necessarily convert any C++ developers at all.
oddthink 34 days ago [-]
There's just so much C++ at Google that really has no business being C++ and falls into the "networked server" category. At least large swaths of the Search and Maps codebase, large chunks of flume (beam) batch pipelines, etc., etc. It's only historical accident and network-effect stickiness that keep that from being written in Java.
I could easily imagine him thinking Go could make inroads there. But then it took a very long time to get a flume port, and even then it didn't have half of the nice affordances that the C++ version did.
People say they need the efficiencies of C++, but IMHO they really don't when so much of the actual code time is spent slurping data from one sstable and writing it to the next sstable.
pjmlp 34 days ago [-]
Google themselves never were big Go adopters, the language adoption is mostly an external factor.
sapiogram 34 days ago [-]
Definitely agree. My theory is that Rob Pike was mentally stuck in the 90s, when C++ was still a common choice for non-performance-critical entreprise code.
o11c 34 days ago [-]
It really shouldn't be a surprise given that it took until 2022 to implement one of the basic features that everybody relies on constantly.
treyd 35 days ago [-]
Go isn't a C++ killer because there are very many use cases C++ is commonly used for where having a runtime with a GC is an absolute nonstarter. Other downsides include somewhat janky and slow FFI to C, limited control over how/when data structures are copied, and green threading making granular control over processes, forking, shared memory, etc. harder. These are all things that are expected in systems programming, and presenting Go as a systems language is dishonest.
Rust doesn't suffer from any of those issues and had a feature set comparable to that of C++ from before even 1.0, so there is actually a tenable argument that it could be a C++ killer. That didn't quite manifest because there isn't a strong reason to rewrite existing large C++ codebases, but most kinds of new projects that would have been written in C++ in like 2010 have increasingly been done in Rust instead.
milesrout 34 days ago [-]
Go is a C++ killer in a different sense: there are things that people would have written in C++ (or often in C) that are today often written in Go instead. Not everything, obviously. But Go is a great Unix glue language for programs too big to write in Shell and often those would be written in C or C++ in the past.
baranul 34 days ago [-]
Agree with your statements. Go's history on its Wikipedia page[1] and in other sources explains that one of the original points that led to its development was "dislike of C++", which was causing them all kinds of problems at their office (as explained in those sources).
Go was used to replace it, where possible and for their purposes, but not in a 1 to 1 way. That's neither a total replacement for C or C++, but rather a replacement in particular areas they deemed important. Carbon, on the other hand, is going that extra step where the creators of Go didn't want to.
Go doesn't even play in the same niche as C++, the part that could have been replaced by a managed language was long replaced by Java.
pjmlp 34 days ago [-]
With a much more modern type system.
ReflectedImage 33 days ago [-]
That's part of the point: it comes from the software-engineering viewpoint of preferring simpler type systems, rather than the academic viewpoint of preferring more complex ones.
pjmlp 33 days ago [-]
Yeah, Go's simplicity approach has produced a wonderful piece of engineering, without any design flaws. /s
DashAnimal 35 days ago [-]
As someone who uses C++ daily, I'm very excited for Carbon. I really align with the goals they have set. It's a shame communities like r/cpp block any discussion of successor languages, but I hope once this language starts gaining more momentum and nears release, it will begin to market itself and get more attention.
For now, anyway. Let's see how long it survives. (+1 to being excited for Carbon. That shouldn't be a surprise though, considering I wrote this article.)
free_bip 34 days ago [-]
It's been removed by the mods unfortunately.
Mond_ 34 days ago [-]
Thank you r/cpp Reddit moderators for doing your part in keeping users safe.
saagarjha 34 days ago [-]
What, the memory safety wasn't enough?
bena 35 days ago [-]
It depends on the aim of the community. If it is to discuss the language, advancements, tips, etc., then restricting talk of other languages makes sense. Otherwise the subreddit can become just talk of replacement languages.
Boldened15 34 days ago [-]
Subreddits sometimes split the difference by dedicating a single day of the week where something is allowed, eg. only allowing you to post questions on Fridays, or creating threads to consolidate off-topic discussion.
steveklabnik 34 days ago [-]
This is a great post! I'm very intrigued to see how Carbon ends up. I've also enjoyed reading about some of their implementation choices, being more data-driven.
Mond_ 34 days ago [-]
Thanks! Huge fan of your blog. :)
steveklabnik 34 days ago [-]
Thank you, that's very kind :)
hatwd 35 days ago [-]
Interesting language - its syntax looks like a mix of Rust and Go, with a few of its own idiosyncrasies to distinguish it from those languages.
> Carbon is a concentrated experimental effort to develop tooling that will facilitate automated large-scale long-term migrations of existing C++ code to a modern, well-annotated programming language with a modern, transparent process of evolution and governance model.
This is probably where Go and Rust fail to be C/C++ "successor" languages, as interop between those languages doesn't seem to be as seamless as Carbon aims to be.
Will keep an eye on its development!
habitue 34 days ago [-]
This immediately suggests what you want is a gradual migration process from one language through several intermediate languages to your target language.
i.e. instead of a big bang C++ -> Carbon migration, instead you want something like:
(Basically fake names for more gradual transitional intermediate languages, assuming Google would like to have everything in Rust eventually.)
The key idea of "targeting automated migrations" makes this kind of thing feasible
Mond_ 34 days ago [-]
Yup! I didn't work this into the post, but this is in fact exactly (more or less) the goal.
Chandler even explicitly says "Maybe we can eventually convert some future version of Carbon to Rust, who knows."
Whether that is going to happen (or is viable) is unclear for now, but it's pretty clear that a migration of C++ to (idiomatic) Carbon will probably involve at least a few steps.
IshKebab 34 days ago [-]
That's not really possible. Rust isn't "C++ but better". It has several design decisions that make it not really compatible with C++, e.g. no move constructors & the use of fat pointers.
zozbot234 34 days ago [-]
The main obstacle to making Rust a "C++ but better" language is arguably the half-baked design of its support for pinned data. (Because "pinned", i.e. location-sensitive objects that can't simply be memcpy'd elsewhere are ubiquitous in idiomatic C/C++.) But there are several language proposals to mitigate this in the future, though for most of these a full transition will require an edition change in order not to break backward-compat.
As for the use of 'fat' vs. 'thin' pointers for dynamic dispatch, a C++-style vtable ptr is just a &'static VTableStruct, where VTableStruct is a dictionary of functions. The anyhow crate is a good example of how you can implement that particular approach.
habitue 34 days ago [-]
I think I'd say "it's hard, and would require making some opinionated decisions as to what to convert something to" vs "not possible". The bar is pretty high for "not possible"
noelwelsh 35 days ago [-]
What I've seen of Carbon looks really good. I feel they're trying to do something that is hugely ambitious, and so perhaps unlikely to succeed, but I love the vision.
As I don't have a massive C++ codebase I have no stake in Carbon's success, but I think language improvements are some of the most significant steps we, as an industry, can take (language improvements are basically the only way we can rule out entire classes of bugs) and I want our industry to improve.
mhh__ 34 days ago [-]
I highly doubt I'll ever use Carbon but I'm really enjoying their statements of their ideology and choices on various matters in the github repo.
Remnant44 34 days ago [-]
I thought this was a well written and fair article. My preference would be that C++ is not "forked", but I understand the reasons for the impasse and consequences.
I found the part about member access operators interesting.
I've written C++ for 25 years, and although I've used "pointer"-to-member(function/fields) ability many times (indeed, this is how you bind signals to your member slots statically in QT!), I had no idea that it could both be null, and that -1 was used for a 'null' 'pointer' in this instance.
Good example of how large the language is, with so many weird dark corners. The number of air quotes needed above is pretty illuminating on its own.
Neywiny 34 days ago [-]
I've heard that null pointers were greatly regretted. My favorite example of their issues is that there exist embedded devices with memory at address 0. Even worse, it's ITCM (instruction memory), so one might think that on startup you may want to copy a block of program code into it for lower latency (such as ISRs). Maybe you want to point the DMA at that while the CPU is busy initializing other things. But you can't. Because the driver code checks for... a null pointer, where that's defined as 0.
Also, anything that doesn't do a null check (failed malloc anyone?) ends up writing to that memory. Good luck if you need it for something.
Similarly, when I was writing a simulator for the chip, I ended up commenting out that block for a lot of the work, because it was masking so many issues as different issues.
Unsure if newer generations moved it to somewhere else.
willtemperley 34 days ago [-]
Working on a Swift project which is integrating an irreplaceable C++ library, this is really interesting.
My immediate thoughts are: would it be possible to interface with Carbon from Swift, delegating things like memory management of C++ objects to Carbon?
comex 34 days ago [-]
I like the idea of Carbon and I hope to use it in the future. But wow,
> We consider it a non-goal to support legacy code for which the source code is no longer available
is such a pejorative way to talk about ABI stability.
Yes, supporting code you lost the source for is one use case for ABI compatibility. I guess it’s a real need that some people have. But it’s by far the most unsympathetic use case. It reeks of antiquated development practices. Plus, binaries with lost source are inherently a ticking time bomb since you won’t be able to port the code to new platforms or make any fixes.
But what about all the other use cases for ABI stability?
What if your vendor has the source code but doesn’t want to give it to you?
What if you do have the source, but you want to be able to update libraries without rebuilding every executable that depends on them? Especially valuable if you’re building an operating system, Linux or otherwise. (There’s a reason that Swift spent so much effort on ABI compatibility.)
What if you want to let people build dynamically-loadable plugins? (This one at least gets a brief mention in the Carbon goals document, albeit under the "legacy compiled libraries" section.)
What if you just want to save build time via pre-built libraries?
Don't get me wrong, I'm not against Carbon's decision to make ABI stability a non-goal. There are real tradeoffs in making ABI stability work. I'm just saying, if you're going to explain why you don't support it, don't dismiss the use cases for it by picking such a poor exemplar.
chandlerc1024 34 days ago [-]
(Carbon lead for context)
I think you're missing the point of the example in a major way...
Personally, I care about finding ways to support a bunch of these other use cases where we can and in good ways. Especially things like build times, dynamically loaded plugins, and vendored libraries. I think we can and _need_ to find reasonable solutions to those.
The specific example is the only one called out because it's the only one that is fundamentally a non-goal.
The other use cases I think we can find good ways to support, but they may not look like a stable ABI for the entire language. Maybe a designated subset or surface which is designed to have long-term link compatibility in some way, etc. Removing that specific use case is important because it opens up more candidate solutions in the space.
And to be clear, this isn't a straw-person use case. This specific wording and use case was a response to that use case being actively supported by the C++ committee on several occasions.
zozbot234 34 days ago [-]
There is a stable ABI for these purposes, it's called the C ABI. Or you could use modern "naked functions" support and set up your own function prologues, epilogues and parameter passing in assembly code.
pjmlp 34 days ago [-]
Its clunkiness in modern times is why I'd rather deal with COM boilerplate, or OS IPC.
We deserve something better than 1970's view of OS ABIs.
JackC 34 days ago [-]
I think the distinction is that these use cases require an ABI, but not the same ABI? You could in theory migrate both sides to Carbon for all of these, if it was worth it.
rvense 34 days ago [-]
I know enough C++ to understand that "member access operator" example, but not enough to have ever seen that before, and my first thought is just "big yikes".
I'm sure there are cases where this will mean you can write more general and flexible library code, but is it really worth the cost? C++ just seems so full of this three-starred nonsense.
spacechild1 34 days ago [-]
Although taking pointers to member functions is much more common, pointers to member variables certainly have their use cases. Here's one example: binding C++ member variables to a scripting language.
// read and write variable
player_type["speed"] = &player::speed;
Should a modern language support member access operators? Probably not, honestly.
Is it really impressive that Carbon found an abstraction to support it that neatly fits into a Rust-like trait system and generalizes all sorts of things?
Yes.
rvense 34 days ago [-]
Now, that I won't argue with at all! There are some big-brained people who will find ways forward for the various users of C++: some will migrate, some will create impressive tooling, and some will likely evolve the language as it is; in the grand scheme of things, C++98 becoming an ambiguous term isn't too far off. These are all economic necessities, given just how much C++ has been deposited in the world over the past 35 years.
But I also can't help but feel like maybe the big-brainedness required for all this machinery around the language is of sort of the same stock that created the complexity which made the machinery necessary in the first place? Is this a cultural issue? Is the biggest problem in computer science actually that we have too many smart people?
drysine 34 days ago [-]
Don't you think that it is much much simpler than, for example, any of the meta-programming stuff in Python?
ch33zer 34 days ago [-]
It's quite handy in interpreters: read the instruction, store a pointer to the referenced field, then operate on it. That's the only time I used it at least.
binary132 34 days ago [-]
“We should simply replace the Committee with Google” is not a compelling reason to switch to Carbon. It’s actually kind of the opposite of that.
Mond_ 33 days ago [-]
Have you read the section of the article addressing this concern?
theanonymousone 34 days ago [-]
Based on what I read, can we say Carbon is the Python 3 of the C++ ecosystem?
The issue is that I had to cram them all in somehow, without making the article excessively long. I'm open to criticism, though!
Hopefully Carbon will succeed in achieving this goal.
C++ is definitely in a tough spot on the evolutionary trail. But the idea that the only path ahead is through incompatible changes seems likely to produce the same harmful effects.
Instead, C++ has added all sorts of slightly incompatible features every three years since 2011, and is expected to do so again. Periodically one of these incompatible changes is especially troublesome for important people, and, like a naughty toddler, the committee promises not to do that again.
Yet even as these accidental incompatibilities turn out to have larger consequences than expected, other changes that were known to be incompatible but seemed "worth it" are rejected for fear of the consequences. The worst of both worlds.
There is no plan for binary libraries across epochs, for how different implementations would interact with each other, or for how multiple crates requiring different versions would interact if their public API crosses versions with different semantics, and so on.
There's a recent Reddit discussion which had zero pushback against changes but instead people who were disappointed that 2024 Edition won't land all the things they'd hoped for such as the improved Range types†
Without the Edition system we know from C++ that when you say "Why can't we have X?" the defenders appear to tell you that all choices except the status quo are impossible. They will do this regardless of how difficult such a change would be, and so this immediately deflates all interest in incremental improvement. It's a self-fulfilling prophecy.
But with Editions there's a lot of things we definitely can do. Once you open the door to that, it really drives enthusiasm and that enthusiasm can power some of the difficult changes you wouldn't even have considered in the C++ process.
† In Rust, syntax like 1..=4 is compiled into the core type core::ops::RangeInclusive<i32>, so this is a very natural way to express what you meant. And it's not a goofy special case; it's just a normal literal, like "Word" (an &'static str) or '€' (a char) or 1.2345 (a floating point defaulting to f64). However, we now realise these range types made some poor choices: if you made them today, you'd make them implement Copy and IntoIterator, whereas the existing types implement Iterator and so in many cases cannot implement Copy. Ergonomically, a Copy type would be much nicer here. Can we fix that? Well, the idea is: we make the new types (they exist in nightly Rust in the new core::range namespace but aren't stabilized), and once we're sure they're exactly what we want, we make a new Edition which changes the syntax transformation so that that literal becomes a core::range::RangeInclusive<i32> instead.
Though on the other hand, we can only imagine the counterfactual of not doing those changes. I reckon it would be at least as bad as the 2->3 changes. At the very least I wouldn't want to go back to a world where bytes and strings were unified.
Whenever I hear someone say, "Gee, the only path ahead is a breaking change, sorry charlie!", what I really hear is, "I gave up on finding a way to do it in an evolutionary way, and I am lazy and just want to declare a revolution!"
There is always a pathway ahead on the evolutionary path. That's the premise at least.
Take something like std::unique_ptr. The basic problem is that it is absolutely capable of being written such that std::unique_ptr<T> has the same ABI as a raw T* (which is all it holds, with the default deleter), but it doesn't: it must be passed in memory rather than in a register. It's pretty damn trivial to write the necessary thunking between the ABI versions, so it's also absolutely possible to have both old-ABI and new-ABI coexist (something that was a lot more difficult for the infamous Python 2->3 transition). And other than the internal changes needed to give it a new ABI, there's not really any user-visible API that would need to be migrated. But it still can't be done, because std::unique_ptr is used on the interface between library A and library B, and how is A supposed to know which version of the ABI B is expecting, when neither A nor B knows how the other was compiled?
> The goal is a tool-assisted migration of idiomatic code, not a fully automated migration of all code.
This does not mean that they will intentionally avoid making a fully automated migration possible.
Normally the migration tools should be designed to attempt a fully automated migration, but wherever there are corner cases for which the effort to handle them completely automatically would not be worthwhile, human intervention will be required.
In C++ in particular one of the most obvious ways to write unmaintainable C++ which such automation couldn't be expected to migrate is via abuse of the pre-processor.
C++ retains the entire C pre-processor, which is a weird not-quite-language-agnostic text macro language on top of C. We can abuse this to write a different programming language, with different rules, and then at compile time the pre-processor transforms it into C++ which compiles.
I'm confident a migration tool could not always usefully translate this, and I'd argue that's not a failing of the tool so much as an inevitable consequence of writing this unmaintainable code.
Yep. I'm not a particularly big fan of Google languages, but focusing on bridging legacy code is brilliant.
Would have been interesting if Oracle had done the same with Java shortly after acquiring it. Java is so hemmed in by legacy compatibility it basically can't make any significant language or VM changes without making harsh compromises. Of course, Oracle would likely have supported legacy Java indefinitely, and made a fortune supporting/licensing it.
Carbon was officially removed with 10.15 Catalina in 2019 - what's the statute of limitations on reusing a name like this?
We tried other names, but we found collisions with essentially all of them. =/ We ended up picking a "least bad", and actually talked to a couple of folks familiar with the old usage to see if it was a worse collision than we realized. They weren't delighted but generally shrugged. So here we are. =/
It's definitely not perfect, but I think it's much more searchable than "C" or some other choices. Ultimately, I think it's at least not bad enough to matter compared to the actual project.
A big goal was being short and easily pronounced, including by non-native English speakers, in a recognizable way from reading the text. That made the overwhelming majority of "fun" spellings not work well.
On one hand, I feel like we're just not as good at naming as Rust and Zig. Both of those names are :chefskiss:
On the other hand, Carbon does have a bunch of awesome puns waiting for us... So we've got that going for us. =D
But it isn't that we're directly using this; rather, definition-checked generics are fairly similar to the ideas in that series of proposals, which led to the generics in Swift. They're also closely related to the generics in Rust, etc.
When something has been deprecated (like Carbon API which I didn't even know about) then it's imo completely fair game.
Also naming things is surely the most fitting use case for ChatGPT, no?
Yeah, I hear that kind of excuse a lot.
Last time, someone had come up with some kind of new database-oriented language or something and they called it "Limbo".
Limbo is the programming language in Inferno. Plan 9 is what the Unix creators did next -- it's UNIX 2. Inferno is UNIX 3; it's what Plan 9 developed into.
It is the next language from the team that developed C.
It may not be widely-used but it's important, significant, and just as someone knowing their history makes me take their work more seriously, someone not knowing their history makes me think they have less to contribute, because they clearly haven't gone looking at prior art.
Ignorance is no excuse.
For someone to know what they're doing, they need to have at least a vague idea of whose shoulders they're standing on (as Isaac Newton put it). If they don't, they could be reinventing a wheel, and if they call things "struts" and "roundbuffers" and "spinny-pivots" then this says they don't know about "spokes" and "tyres" and "hubs". And making it hexagonal.
The flipside of this coin is making life easier for the community to search and learn.
Um, Catalina is a part of the Tomcat Java application server. Not sure what that has to do with Apple stuff.
While I love Go, I feel like languages like Go and Rust didn’t become C++ killers because they expected everyone to jump on the bandwagon and abandon all their legacy code. That didn’t happen.
This approach of keeping the interop intact might just work.
What gave you that impression? I'd say approximately 0 people from the Go community and at most 2 people from Rust expected that.
Quite literally that's what Rob Pike (golang co-creator) thought was going to happen
https://commandcenter.blogspot.com/2012/06/less-is-exponenti...
> I was asked a few weeks ago, "What was the biggest surprise you encountered rolling out Go?" I knew the answer instantly: Although we expected C++ programmers to see Go as an alternative, instead most Go programmers come from languages like Python and Ruby. Very few come from C++.
That's not even close to what Rob Pike wrote on his blog.
What part of "Although we expected C++ programmers to see Go as an alternative" isn't clear enough?
> I feel like languages like Go and Rust didn’t become C++ killers because they expected everyone to jump on the bandwagon and abandon all their legacy code
Abandon is not synonymous with rewrite, last time I checked a dictionary.
Rob probably assumed Go could displace this. And it's not unreasonable to assume so, although it's closer to a better Java, than a better C++ for this.
Instead it displaced Python for this (not for research/NumPy/Colab stuff), maybe some Java (where it's easier to containerize).
And if it did displace C++, it was in greenfield projects, with non-C++ developers. So it didn't necessarily convert any C++ developers at all.
I could easily imagine him thinking Go could make inroads there. But then it took a very long time to get a flume port, and even then it didn't have half of the nice affordances that the C++ version did.
People say they need the efficiencies of C++, but IMHO they really don't when so much of the actual code time is spent slurping data from one sstable and writing it to the next sstable.
Rust doesn't suffer from any of those issues and had a feature set comparable to that of C++ from before even 1.0, so there is actually a tenable argument that it could be a C++ killer. That didn't quite manifest because there isn't a strong reason to rewrite existing large C++ codebases, but most kinds of new projects that would have been written in C++ in like 2010 have increasingly been done in Rust instead.
Go was used to replace it, where possible and for their purposes, but not in a 1 to 1 way. That's neither a total replacement for C or C++, but rather a replacement in particular areas they deemed important. Carbon, on the other hand, is going that extra step where the creators of Go didn't want to.
For now, anyway. Let's see how long it survives. (+1 to being excited for Carbon. That shouldn't be a surprise though, considering I wrote this article.)
> Carbon is a concentrated experimental effort to develop tooling that will facilitate automated large-scale long-term migrations of existing C++ code to a modern, well-annotated programming language with a modern, transparent process of evolution and governance model.
This is probably where Go and Rust fail to be C/C++ "successor" languages, as interop between those languages doesn't seem to be as seamless as Carbon aims to be.
Will keep an eye on its development!
i.e. instead of a big bang C++ -> Carbon migration, instead you want something like:
C++ -> C+++ -> Carbon-- -> Carbon -> Carbon++ -> Rust
(Basically fake names for more gradual transitional intermediate languages, assuming Google would like to have everything in Rust eventually.)
The key idea of "targeting automated migrations" makes this kind of thing feasible.
Chandler even explicitly says "Maybe we can eventually convert some future version of Carbon to Rust, who knows."
Whether that is going to happen (or is viable) is unclear for now, but it's pretty clear that a migration of C++ to (idiomatic) Carbon will probably involve at least a few steps.
As for the use of 'fat' vs. 'thin' pointers for dynamic dispatch, a C++-style vtable ptr is just a &'static VTableStruct, where VTableStruct is a dictionary of functions. The anyhow crate is a good example of how you can implement that particular approach.
As I don't have a massive C++ codebase, I have no stake in Carbon's success, but I think language improvements are some of the most significant steps we, as an industry, can take (they are basically the only way we can rule out entire classes of bugs), and I want our industry to improve.
I found the part about member access operators interesting.
I've written C++ for 25 years, and although I've used the "pointer"-to-member (function/field) ability many times (indeed, this is how you bind signals to your member slots statically in Qt!), I had no idea that it could be null, nor that -1 was used as the 'null' 'pointer' in this instance.
A good example of how large the language is, with so many weird dark corners. The number of air quotes needed above is pretty illuminating on its own.
Also, anything that doesn't do a null check (failed malloc anyone?) ends up writing to that memory. Good luck if you need it for something.
Similarly when I was writing a simulator for the chip I ended up commenting out that block for a lot of it because it was masking so many issues as other issues.
Unsure if newer generations moved it to somewhere else.
My immediate thoughts are: would it be possible to interface with Carbon from Swift, delegating things like memory management of C++ objects to Carbon?
> We consider it a non-goal to support legacy code for which the source code is no longer available
is such a pejorative way to talk about ABI stability.
Yes, supporting code you lost the source for is one use case for ABI compatibility. I guess it’s a real need that some people have. But it’s by far the most unsympathetic use case. It reeks of antiquated development practices. Plus, binaries with lost source are inherently a ticking time bomb since you won’t be able to port the code to new platforms or make any fixes.
But what about all the other use cases for ABI stability?
What if your vendor has the source code but doesn’t want to give it to you?
What if you do have the source, but you want to be able to update libraries without rebuilding every executable that depends on them? Especially valuable if you’re building an operating system, Linux or otherwise. (There’s a reason that Swift spent so much effort on ABI compatibility.)
What if you want to let people build dynamically-loadable plugins? (This one at least gets a brief mention in the Carbon goals document, albeit under the "legacy compiled libraries" section.)
What if you just want to save build time via pre-built libraries?
Don't get me wrong, I'm not against Carbon's decision to make ABI stability a non-goal. There are real tradeoffs in making ABI stability work. I'm just saying, if you're going to explain why you don't support it, don't dismiss the use cases for it by picking such a poor exemplar.
I think you're missing the point of the example in a major way...
Personally, I care about finding ways to support a bunch of these other use cases where we can and in good ways. Especially things like build times, dynamically loaded plugins, and vendored libraries. I think we can and _need_ to find reasonable solutions to those.
The specific example is the only one called out because it's the only one that is fundamentally a non-goal.