Not necessarily, you still need backups or snapshots, especially of your home directory, in case some software has a nasty bug like deleting your data.
Yup, and I am getting sick of hearing this even on Arch Linux. Like, mofo, you could literally take a snapshot or a backup before upgrading; don’t blame us if you’re yoloing your god damn computer. Windows has exactly the same problem, and this is why we have backups. Christ.
On my Arch Linux install, I literally have a pacman hook that forcibly runs a backup and verifies said backup before doing a system-wide update.
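Roughly, the hook looks something like this (the script path and file name are placeholders, not my exact setup; the script does the backup and exits non-zero if verification fails, which aborts the transaction):

```ini
[Trigger]
Operation = Upgrade
Type = Package
Target = *

[Action]
Description = Running and verifying backup before system upgrade...
When = PreTransaction
Exec = /usr/local/bin/backup-and-verify.sh
AbortOnFail
```

Drop it in `/etc/pacman.d/hooks/` and every `pacman -Syu` refuses to proceed until the backup checks out.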
That one was an old piece of documentation where some Chinese folks documented a lot of quirks of the X11 protocol. I paid about $6000 for a translator to translate that doc to English, and I used it to build my own GUI toolkit on Linux that I still use to this day.
How it really works:
```c
mpf_t temperature;
```
It’s an arbitrary-precision floating-point type provided by LibGMP, and you can find more information about mpf_t here.
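For a rough idea of how it gets used (a minimal sketch, nothing specific to my project; link with `-lgmp`):

```c
#include <gmp.h>
#include <stdio.h>

int main(void) {
    mpf_t temperature;
    mpf_init2(temperature, 256);             /* request 256 bits of precision */
    mpf_set_d(temperature, 21.5);            /* seed it from a plain double */
    mpf_add_ui(temperature, temperature, 3); /* arithmetic stays arbitrary-precision */
    gmp_printf("temperature = %.10Ff\n", temperature);
    mpf_clear(temperature);                  /* GMP values must be freed explicitly */
    return 0;
}
```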
Oof, sorry. :( I had hoped that they had sorted it out by then…
Maybe try installing SwiftShader?
Yep, and if open source licensing could be revoked on a whim, you can imagine the chaos that would ensue. That would be my understanding as well: old versions under the MPL license are perfectly fine to fork off; newer versions might not be, as they are under a different license. One of the reasons I like the Apache License is that it makes it explicitly clear that it’s irrevocable, whereas the MPL operates on the assumption that it’s not revocable. The most fundamental problem with the legal system in the USA is that no law is “set in stone”, and leaving things to assumption leaves them open to reinterpretation by a judge who may side against you. (Hell, Google v. Oracle on copyrighted APIs is still decided on a case-by-case basis, so take it as you will.)
Disclaimer: I am not a lawyer. I’m just sharing what I learned from the LegalEagle YouTube channel and a few other sources.
I concur, there are a few problems that might come up on various platforms, like Windows not implementing C11 standard threads and other stuff; you would instead use the TinyCThread library, which works like a polyfill.
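A minimal sketch of what I mean (assuming TinyCThread is vendored into your project; on platforms with native C11 threads you include <threads.h> instead and the calls stay the same):

```c
#include "tinycthread.h"   /* mirrors the C11 <threads.h> API */
#include <stdio.h>

static int worker(void *arg) {
    printf("hello from thread, arg = %d\n", *(int *)arg);
    return 0;
}

int main(void) {
    thrd_t t;
    int value = 42;
    if (thrd_create(&t, worker, &value) != thrd_success)
        return 1;
    thrd_join(t, NULL);    /* same signatures as the standard header */
    return 0;
}
```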
All problems and challenges are workable; if the problem with Debian is out-of-date libraries, you could set up a CI/CD release build that rebuilds your software whenever an update occurs and statically links the updated dependencies.
Back to your point: if they didn’t design their code and architecture to be multi-platform, like you would in C, they need to re-evaluate their design decisions.
I would spend it on language translation, basically: paying someone to translate international documentation on things that aren’t documented in the USA no matter where you look.
It’s rooted in my frustration from when I was trying to fill in missing implementation details in projects like Skia (at the time it lacked support for Vulkan). My fundamental core belief is that core libraries, like Skia, neural-net frameworks, and other crucial projects, should offer a C API that allows every type and implementation to be extended by any other language that can interface with a C API, by providing your own vtable or whatnot.
One of the approaches I took for my GUI toolkit written in C (specifically on Linux, to replace Qt and GTK) was implementing single-inheritance object-oriented programming in C. If you insert the base class type structure at the top of your custom struct type and provide your own vtable for those objects, you can readily extend the underlying library natively in whatever programming language you use, assuming it can talk to a C API in a complete sense.
Let me know if you want a fuller demonstration, I would be happy to find the time to set up a small sample; a rough sketch of the pattern is below.
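Something like this (hypothetical names, not my actual toolkit’s API):

```c
#include <stdio.h>

/* "Base class": a vtable of function pointers plus the base fields. */
typedef struct Widget Widget;

typedef struct WidgetVTable {
    void (*draw)(Widget *self);
    void (*destroy)(Widget *self);
} WidgetVTable;

struct Widget {
    const WidgetVTable *vtable;
    int x, y, width, height;
};

/* "Derived class": the base struct is the FIRST member, so a Button* can be
   safely cast to Widget* and handed to anything that expects the base type. */
typedef struct Button {
    Widget base;            /* must come first for the cast to be valid */
    const char *label;
} Button;

static void button_draw(Widget *self) {
    Button *btn = (Button *)self;   /* downcast is fine: base sits at offset 0 */
    printf("Button '%s' at (%d, %d)\n", btn->label, self->x, self->y);
}

static void button_destroy(Widget *self) { (void)self; /* nothing heap-owned here */ }

static const WidgetVTable button_vtable = { button_draw, button_destroy };

/* Generic library code only ever sees Widget*, dispatching through the vtable. */
static void widget_draw(Widget *w) { w->vtable->draw(w); }

int main(void) {
    Button ok = { { &button_vtable, 10, 20, 80, 24 }, "OK" };
    widget_draw((Widget *)&ok);     /* any language with a C FFI can do the same */
    return 0;
}
```

Any language that can lay out a C-compatible struct and fill in function pointers can subclass Widget without touching the library’s source.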
And I am also aware of the criticisms of that approach: the verbosity of attempting object-oriented programming in C is kind of absurd, and the API surface balloons. That is largely why I work on a compiler-generator framework, specifically to address those challenges by letting me add dialects on top of the C language such as generics, object-oriented programming, and so on. I brought C closer to C# in terms of syntax and features, and at the end of compilation it still produces readable C code as output. It also generates what I call an FFI-JSON: essentially a JSON file that describes all of the types used in a C project, the sizes of integers/floating points, structure types and their fields/offsets/sizes/comments, and function declarations. It’s done in a way that you can read the JSON file and generate a binding library for your programming language, saving you weeks of work.
I don’t think so, on the extensibility aspect alone: some of Rust’s syntax/traits don’t map well to other languages when those languages attempt to extend a Rust library. I write C code in a way that is extensible from any language.
Yep, the biggest reason why I chose the C language is the foreign function interface. Code you write in C is more than likely usable from just about any other language.
I would most likely be using C11 for threads.h and stdatomic.h for the foreseeable future. The problem with using the latest and greatest standard is the risk of compilers not supporting it, so I would likely wait at least 5 years before switching to C23, sometime in 2028 or 2029. There was a bit of a controversy around the optional bounds checking in C11 that they ended up deprecating/removing, and I am sure C23 will have something similar going on.
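For what it’s worth, the kind of C11 usage I mean is just this (a small sketch, nothing project-specific):

```c
#include <threads.h>
#include <stdatomic.h>
#include <stdio.h>

static atomic_int counter = 0;

static int worker(void *arg) {
    (void)arg;
    for (int i = 0; i < 100000; i++)
        atomic_fetch_add(&counter, 1);   /* lock-free increment, no mutex needed */
    return 0;
}

int main(void) {
    thrd_t a, b;
    if (thrd_create(&a, worker, NULL) != thrd_success ||
        thrd_create(&b, worker, NULL) != thrd_success)
        return 1;
    thrd_join(a, NULL);
    thrd_join(b, NULL);
    printf("counter = %d\n", atomic_load(&counter));  /* expect 200000 */
    return 0;
}
```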
I don’t plan on using #embed or constexpr, in favor of maintaining common C programming practices; language familiarity is still an important factor in a thriving project, as much as people nag me to rewrite everything in Rust or C++.
Because the code they release to the public is usually MIT-licensed, like the .NET Core runtime. They just have a long history of hating GPL-licensed software.
I tried to use it, but it has some big reliability issues, because at the end of the day, despite the dataset it’s trained on, it’s still something I’d describe as “language interpolation.”
It sometimes makes TERRIBLE recommendations for which tools/libraries I should explore, because it assumes those libraries might have the support I need. Those libraries never do, and so I wasted weeks on it. (It doesn’t help that both the code and the project are undocumented.)
So after that experience, I demoted ChatGPT’s usefulness to just “cleaning up pre-written documentation so it sounds better.” That’s it.
Sure, until you can’t with Flatpak. Flatpak does not safeguard against system binaries, and there are always risks associated with that.
Honestly, I think I am going to move on from Programming.dev; it’s filled with script kiddies like you. Good lord.
Fuck y’all. Good evening.