

I’d be very skeptical of claims that Debian maintainers actually audit the code of each piece of software they package. Perhaps they make some brief reviews, but actually scrutinizing every line for hidden backdoors is just not feasible.
Any accessibility service will also see the “hidden links”, and while a blind person with a screen reader will notice if they wander off into generated pages, it will waste their time too. Especially if they don’t know about such a “feature”, they’ll be very confused.
Also, I don’t know about you, but I absolutely have a use for crawling X, Google Maps, Reddit, and YouTube, and getting information from them without interacting with those services myself.
I would love to think so. But the word “verified” suggests more.
That makes me think: perhaps you might be able to set it to exec("stuff") or True.
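To illustrate (a minimal sketch, assuming the option is a string the application feeds to Python’s eval(); the variable name is made up): exec() returns None, so appending or True both runs the payload and leaves the expression truthy for whatever check reads it.

    # Sketch only: assumes the app evaluates a user-controlled option with eval().
    user_setting = "exec(\"print('arbitrary code ran')\") or True"

    # exec() returns None, and None or True is True, so the "flag" looks enabled
    # while the embedded code runs as a side effect.
    if eval(user_setting):
        print("setting is enabled")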
…
while allowing legitimate users and verified crawlers to browse normally.
What is a “verified crawler”, though? What worries me is whether it’s now only big companies like Google that are allowed to have them.
I agree that it’s difficult to enforce such a requirement on individuals. That said, I don’t agree that nobody cares about the content they post. If someone has “something cool they made with AI generation”, then it’s not a big deal for them to have to mark it as AI-generated.
An intelligence service monitors social media. They may as well have said, “The sky is blue.”
More interesting is,
Sharing as a force multiplier
– OpenAI
Do you know of a provider that is actually private? The few privacy policies I checked all had something like “We might keep some of your data for some time for anti-abuse or other reasons”…
Too bad that’s based on macros. A full preprocessor could require that all keywords and names in each scope form a prefix code, and then allow us to freely concatenate them.
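To make “prefix code” concrete (a hedged sketch in Python; the tokens and helper names are invented, not from any real preprocessor): if no name in a scope is a prefix of another, a greedy scan recovers the token boundaries from a bare concatenation.

    # Invented illustration: a prefix code is a token set where no token
    # is a prefix of another, so concatenations decode unambiguously.
    def is_prefix_code(tokens):
        return not any(a != b and b.startswith(a) for a in tokens for b in tokens)

    def decode(s, tokens):
        out = []
        while s:
            # by the prefix-free property, at most one token matches here
            out.append(next(t for t in tokens if s.startswith(t)))
            s = s[len(out[-1]):]
        return out

    scope = {"let", "if", "foo", "bar"}  # prefix-free: decoding needs no separators
    assert is_prefix_code(scope)
    print(decode("letfooifbar", scope))  # ['let', 'foo', 'if', 'bar']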
It works fine for me on Hyprland.
That is why I use just int main(){...} without arguments instead.
I don’t think any kind of “poisoning” actually works. It’s well known by now that data quality is more important than data quantity, so nobody just feeds training data in indiscriminately. At best it would hamper some FOSS AI researchers that don’t have the resources to curate a dataset.
And who hasn’t contributed any code to this particular repo (according to GitHub Insights).
That said, you can use a third-party service only for sending, but receive mail on your self-hosted server.
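For instance, with Postfix the split comes down to a few lines in main.cf (a sketch; smtp.example.com and the credentials path are placeholders, though the parameter names are standard Postfix):

    # /etc/postfix/main.cf (sketch): route all outbound mail via a third-party relay
    relayhost = [smtp.example.com]:587
    smtp_sasl_auth_enable = yes
    smtp_sasl_password_maps = hash:/etc/postfix/sasl_passwd
    smtp_tls_security_level = encrypt
    # Inbound mail is unaffected: your MX records keep pointing at your own server.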
How does it compare to NixOS?