Lol, that’s actually hilarious. But seriously, why not comment on your posts too? Each post is just sitting there with an empty comments section.
I recently put a dozen hours into The Witcher 3 on my Steam Deck over a couple of long flights. I’m pretty sure it all synced correctly when I finally got home and connected to Wi-Fi. Maybe it didn’t work at one time, but I’d be surprised if it still doesn’t.
I don’t see any of The Avalanches on there. All their stuff is good, but if you haven’t listened to them, probably start with the original Since I Left You album.
For reference, I discovered J Dilla - Donuts when trying to find more stuff like The Avalanches.
That sounds less like a skill and more like a very unfortunate freak accident.
Or just the form of a crab in general! Carcinisation is so weird, but apparently evolution sometimes goes “Let’s just do crab again, that shit was 👌”.
My post was about where I thought anti-cheat would need to end up in order to be effective without being invasive, not about the state of anti-cheat now. I gave VAC as an example of a cross-game platform for cheat detection, and thus where Valve would most likely stick something like this.
I was originally going to compare it to a social score, yes, but it differs in that it wouldn’t be a rating that other players would have direct influence over.
If by “hire more people” you mean “train an AI”, then yes, definitely!
Anti-cheat is an arms race. We just find ourselves at a point where the best known strategy for securing a play session means ostracising custom hw/kernel configurations.
But I have to think it’s only a matter of time before even that’s not enough (there already exist ways around kernel-level anti-cheat, including AI-based techniques that are entirely undetectable).
My guess is the logical conclusion involves a universal reputation-based system, where you have an account with some 3rd-party system (maybe VAC) that persists across all the games you play. It would watch your gameplay and maintain a (probably hidden) “risk of cheating” score. Matchmaking in each game would then use this score to always pair you against other accounts with a similar score.
Actually, it might not be a “risk of cheating” score so much as a “fun to play with” score. From a gameplay perspective, it’s just as fun to play against a highly skilled non-cheating human as it is against a bot that plays identically. But it’s less fun to play against a bot that uses info or exploits that even the best non-cheating players don’t have access to (e.g. wallhacks). So really, the system could maintain a playstyle profile for each player, and matchmaking wouldn’t be skill-based; rather, it would attempt to maximize the “fun” of the match-up. If a player is constantly killing people unrealistically fast, or people who play with them tend to drop early, that would degrade their “fun” score, and they’d tend to be matched only with other unfun players.
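If it helps to picture it, here’s a rough sketch of what that kind of score-bucketed matchmaking could look like. To be clear, this is purely hypothetical: every class name, signal, weight, and threshold below is something I made up for illustration, and nothing like it is confirmed to exist in VAC or any other platform.

```python
# Hypothetical sketch of "fun score" matchmaking. All names, weights, and
# thresholds are invented for illustration; this is not how any real
# anti-cheat or matchmaking system is known to work.

class PlayerProfile:
    def __init__(self, account_id):
        self.account_id = account_id
        self.fun_score = 0.5  # neutral start; stays in [0, 1], hidden from the player

    def record_match(self, unrealistic_kill_rate, opponent_quit_rate):
        # Two example "unfun" signals: kills faster than top legit players
        # manage, and opponents dropping out early against this account.
        penalty = 0.5 * unrealistic_kill_rate + 0.5 * opponent_quit_rate
        # Exponential moving average, so one weird match doesn't tank the score.
        self.fun_score = 0.9 * self.fun_score + 0.1 * (1.0 - penalty)


def make_lobby(queue, size=10, tolerance=0.1):
    """Fill a lobby only with players whose scores sit within `tolerance`
    of each other, so suspected cheaters mostly get matched together."""
    if len(queue) < size:
        return None  # not enough players queued yet
    ordered = sorted(queue, key=lambda p: p.fun_score)
    lobby = [ordered[0]]
    for player in ordered[1:]:
        if len(lobby) == size:
            break
        if abs(player.fun_score - lobby[0].fun_score) <= tolerance:
            lobby.append(player)
    return lobby if len(lobby) == size else None
```

The important part is the pairing rule at the end: no bans and no accusations, just suspected cheaters increasingly ending up in lobbies with each other.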
I think this would be the only practical way to fight cheating without resorting to even more invasive methods that just deanonymize players (which I think some studio will inevitably try in the near future).
I disagree that it’s the same, for multiple reasons. First off, the project and its telemetry were never profit-driven; the goal was always to use modern methods of software development to make the software better.
The fact is, these days all for-profit projects gather a ton of info without asking, then use that data to inform their development and debugging (and sell it, but that’s irrelevant to my point). To deny open source software even the option of reporting telemetry is to ask its developers to make a better product than the for-profit competition, with fewer tools at their disposal, and at a fraction of the pay (often on a voluntary basis). That’s just unreasonable.
Which is why the pushback wasn’t that they were using telemetry; it was that they were going to use Google Analytics and Yandex, which are “cheap” options but are obviously for-profit and can’t be trusted as middlemen. They heard the concern and decided to steer toward a non-profit solution instead.
But as a software dev and a Linux user, I often wish I could easily create bug reports using open source, appropriately anonymized telemetry tools. I want to make it as easy as possible for the saints volunteering their time to build a better system for me to use.
As for the issues in Tenacity, they were likely specific to what I was doing. I was rapidly opening and closing a lot of small audio clips, and saving them under different names to network-mounted dirs. I remember I had issues with simple stuff like keyboard shortcuts for opening files, I had to manually use the mouse to select a redundant option every single time (don’t recall what it was), and I think it would just crash trying to save to the network-mounted dir, so I always had to save locally and copy over manually. So I just switched back and continued my work.
Afaik, back when it all went down, they heard the public reaction about the telemetry thing and completely reversed course. On top of that, many distros would be sure to never distribute a build with telemetry enabled anyway. So there has never been any cause for concern. Would love to be proven wrong, though.
Also, Audacity is handy, but it’s not perfect, and I’ll gladly use a better alternative. But the last time I tried Tenacity, it had a bunch of little differences that made the tool just a bit harder to use. So I still default to Audacity.
Which is a good reminder to everyone to support your local Lemmy instances.
“Look man, I appreciate the concern, but really, I’m fine. I just prefer not to socialize.” Then divert your attention to something else.
Or you could pull an SGDQ and go with the ol’ “I would really prefer it if you would be quiet.”
Yeah, but I think it can feel too much like a circle jerk around here sometimes. I get that people want to win over new users, but some of it goes too far I think. The fact is Linux isn’t perfect, and while no OS is, there are some critical things you can do on Windows that are still a pain in the ass on Linux. Some of that is a vendor/proprietary software problem, but a good chunk of it is just people being willing to overlook a thin layer of jank in their normal workflows.
I think we’d all be better off acknowledging and cleaning up the jank rather than trying to pretend it’s fine as is.
There was a time when there was an annual “Linux Sucks” presentation that I liked because it was a roundup of candid, yet constructive criticism of Linux (and then at some point the person running that went off the deep end and started yelling about woke agendas).
I wouldn’t mind a whole community devoted to pointing out shit that is poorly designed or just broken when running Linux, so we as a community can try to fix it or find workarounds.
But as others have pointed out, that community isn’t a community, it’s literally just one account hanging out by themselves.
On top of all the other informative comments answering a plethora of questions you understandably have when entering the Linux ecosystem, I want to express: don’t feel like you need to learn all this stuff if it doesn’t interest you, or otherwise turns you off the idea of Linux.
It’s perfectly fine to ignore all the terminology, install whatever new-user friendly version of Linux you can, and just start using it. If it’s not to your taste, or it asks too much of you, maybe try a different one. But I’m of the firm belief that immediately inundating a new user with a bunch of new vocab and unfamiliar workflows is the mark of a bad new user experience, and you shouldn’t feel required to put up with that.
The fact is, unlike MSFT, which keeps a bunch of terminology internal to the Windows dev teams, Linux is developed in the open, so all the terminology leaks into the user world too. And you just need to get good at saying, “If this doesn’t help me use my PC better for what I need it to do, I don’t care.”
Yeah, I actually kinda like the idea of a whole internet where avoiding virality is somehow built into the system. But I think such a system would naturally evolve into a p2p solution. You couldn’t stop people from taking and rehosting content on their own servers.
And my point was directly in response to your point.
It doesn’t matter if virality is the goal; unless you’re suggesting it be actively prevented, virality is just a natural phenomenon of the internet. The term “viral” generally implies uncontrolled exponential spread, and to this day, stuff goes viral without people intending it to.
And if you architect the system to scale a p2p network proportionally to virality (e.g. as people share content, they also self-host it), you run into a ton of security and abuse challenges. We’re also stretching the definition of “self-hosting” at this point.
I’m all for it. All publicity is good publicity in this space. Open criticism is the first step to better open software.