This isn’t a basic copy of a whole file. This is creating a new file from a portion of an existing file.
Are there downsides to using reflinks, like alignment and read performance issues?
Compatibility shouldn’t be an issue, i.e. it should be relatively simple and safe to have a fallback that copies when reflink isn’t available.
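For illustration, here’s a minimal sketch of that fallback on Linux, using the FICLONERANGE ioctl and dropping to a plain read/write copy when cloning isn’t supported. (The alignment caveat above comes from this ioctl: offsets and lengths generally have to be filesystem-block-aligned.)

```c
/* Sketch: make dst a reflink of a byte range of src, falling back
 * to a plain copy when the filesystem can't clone. Assumes Linux. */
#include <errno.h>
#include <linux/fs.h>   /* FICLONERANGE, struct file_clone_range */
#include <sys/ioctl.h>
#include <unistd.h>

static int copy_portion(int src_fd, int dst_fd, off_t offset, off_t length)
{
    /* Try to share extents first: no data is copied, blocks are CoW.
     * Offset/length generally must be block-aligned (or the length
     * must reach EOF), otherwise the ioctl fails with EINVAL. */
    struct file_clone_range range = {
        .src_fd = src_fd,
        .src_offset = (__u64)offset,
        .src_length = (__u64)length,
        .dest_offset = 0,
    };
    if (ioctl(dst_fd, FICLONERANGE, &range) == 0)
        return 0;
    if (errno != EOPNOTSUPP && errno != EINVAL && errno != EXDEV)
        return -1;  /* a real error, not just "can't reflink here" */

    /* Fallback: ordinary read/write copy of the same range. */
    char buf[1 << 16];
    while (length > 0) {
        size_t chunk = length > (off_t)sizeof(buf) ? sizeof(buf) : (size_t)length;
        ssize_t n = pread(src_fd, buf, chunk, offset);
        if (n <= 0)
            return -1;
        if (write(dst_fd, buf, (size_t)n) != n)
            return -1;
        offset += n;
        length -= n;
    }
    return 0;
}
```

This is essentially what `cp --reflink=auto` does for whole files.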
It would be fair to compare browsers without adding extensions, with default settings.
This would show which browsers have the best security and privacy out of the box. Also, the comparison would be practically impossible otherwise.
Most people use defaults, and I suspect a large portion of users install no extensions, unless maybe a tech-savvy relative adds an ad blocker.
This is precisely why these people are asking, among other things, for access to be strictly restricted to adults.
LLMs are good with language and can be very convincing, especially to children and teenagers, who don’t fully understand how these things work and who are more vulnerable emotionally.
That’s a good point, but there’s more to this story than a gunshot.
The lawsuit alleges, among other things, that the chatbots posed as licensed therapists and as real persons, and caused a minor to suffer mental anguish.
A court may consider these accusations and whether the company bears any responsibility for everything that happened leading up to the child’s death, regardless of whether it finds the company responsible for the death itself.
He gave a reason, and said he’s not going to answer “why” questions, so your guess is as good as anyone else’s.
We should be thankful that this person maintained the app and put up with Google’s bullshit for so long.
If you find this app helpful, consider supporting whoever is willing to take over maintaining the app or a fork.
Thanks for the interesting details. Glad to see there’s an offline version that disables photogrammetry.
The church in England is a good example of where a generic rectangular building model doesn’t work. They could improve the offline version by adding a church model to the set of offline models and using it for 90% of churches in western Europe.
A fully realistic model of every single building may be cool for architects, future historians, city planners, gamers who are sightseeing… but it doesn’t help much when learning to fly. Having a virtual world that looks similar to the real one, with buildings of the right size in the right positions, landmarks, and hero buildings is good enough, and doesn’t require that many resources. There are other parts of flight simulators that are more important to work on.
I happen to know a bit about games and simulators. From a plane’s point of view, houses don’t look unique. A small number of models is enough to fairly represent most houses. There may be a minority of structures that are really unique (stadiums, bridges, landmarks, …), but the vast majority of buildings aren’t. Even if two buildings have different heights, it’s possible to reuse textures if they’re built from the same material.
MSFT appears to have designed the simulator on the assumption that every building is unique, but if they compared buildings and textures, ideally using automation, they would see there’s a massive amount of duplication.
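To sketch the kind of automation I mean (a hypothetical tool, not anything MSFT actually runs): content-hash every asset file and flag identical hashes as dedup candidates. A real pipeline would add perceptual hashing to also catch near-identical textures, not just byte-identical ones.

```c
/* Sketch: flag byte-identical assets as dedup candidates using a
 * content hash (FNV-1a). Hypothetical tool, not MSFT's pipeline. */
#include <stdint.h>
#include <stdio.h>

static uint64_t fnv1a_file(const char *path)
{
    FILE *f = fopen(path, "rb");
    if (!f)
        return 0;
    uint64_t h = 0xcbf29ce484222325ULL;  /* FNV offset basis */
    int c;
    while ((c = fgetc(f)) != EOF)
        h = (h ^ (uint64_t)(unsigned char)c) * 0x100000001b3ULL;  /* FNV prime */
    fclose(f);
    return h;
}

int main(int argc, char **argv)
{
    uint64_t hashes[4096];  /* enough for a sketch */
    int n = argc - 1 > 4096 ? 4096 : argc - 1;
    for (int i = 0; i < n; i++)
        hashes[i] = fnv1a_file(argv[i + 1]);
    /* O(n^2) pairwise comparison is fine for a sketch; identical
     * hashes mean the asset could be stored once and referenced. */
    for (int i = 0; i < n; i++)
        for (int j = i + 1; j < n; j++)
            if (hashes[i] && hashes[i] == hashes[j])
                printf("dedup candidates: %s %s\n", argv[i + 1], argv[j + 1]);
    return 0;
}
```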
I’m not suggesting putting the whole world on a 120GB disk.
That being said, most of the textures and building geometries used for San Andreas could be reused for other cities on the West Coast. Areas between cities that have a lower density could take much less space.
So doubling the physical area covered doesn’t necessarily require doubling the amount of data. But the bandwidth usage of MSFT’s simulator suggests they are not reusing data when they could be.
GTA 5 requires 120GB of disk space, not 500GB. And this includes everything: the game engine, assets, and the entire playable area. https://support.rockstargames.com/articles/203428177/Grand-Theft-Auto-V-PC-system-requirements
Because everything has to fit on the average gaming PC or console’s storage, they have some pressure to optimize data size. A simulator that streams everything has fewer constraints on data size, and less motivation to keep it reasonable.
This shows they’re not trying very hard to optimize the simulator, but instead throwing hardware and bandwidth at it and expecting users to do the same.
Open world games like GTA allow flying over dense areas without using 180Mbps of bandwidth.
Testing infrastructure would help for sure, but it’s not necessarily the lack of infra that’s causing trouble.
Linus complains the author didn’t submit the patch to some places for public comments and testing BEFORE requesting a merge.
It sounds like he expects something like
“Here’s a mailing list thread asking for feedback and testing. No one complained in a week, could you merge?”
I hope Gimp 3.0 stable will happen before the heat-death of the universe.
Hovering over a checkmark will display a message that explains “Google’s signals suggest that this business is the business that it says it is,” which is determined by things like
I guess this due diligence costs time and money. And doing it for every ad customer might affect their bottom line.
- You have a malicious actor on your trusted network.
- If so, you have bigger problems.
This is more likely than you think. There are more computers on the average network than you realise. Many aren’t updated and have vulnerabilities. If there’s malware on one machine on your network, that means a malicious actor is on your network.
Common examples:
I guess integration with Google Drive is a big convenience for users.
But yes, if the cost of getting access is too high for indie developers, then it makes sense to avoid Google Drive. Creating and maintaining your own cloud sync service for a specific app may not be worth it; they should investigate integration with existing Google Drive competitors/alternatives instead.
Translation: We’re extremely short-staffed, so we are shaming our employees into sacrificing their vacation.
Knowledge of the account is an obvious caveat. Yubikey-based MFA is an added layer of protection for accounts, so any kind of attack against MFA assumes the attacker already knows which account to target.
It’s like saying “our door lock is flawed, but the attacker would need to have knowledge of the door”.
The cost and complexity are what’s noteworthy, and more relevant. That said, attack cost and complexity usually go down with advances in tooling and research, so it may be a good idea to plan a progressive retirement of affected keys.
While that’s true, there’s no indication of Microsoft brute forcing with millions of combinations.
The article you link says Microsoft is only trying a few obvious passwords: the filename, and words found in the plaintext message.
Proper encryption isn’t just about using a strong algorithm. It’s also about proper key management, i.e. not sending the password in the clear via the same channel as the encrypted files.
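To illustrate how cheap that kind of guessing is, here’s a hypothetical sketch; `try_password()` is a dummy stand-in for a real archive-decryption attempt, not an actual API:

```c
/* Sketch of the cheap guessing described above: candidates are just
 * the attachment's filename and each word of the plaintext message. */
#include <stdbool.h>
#include <stdio.h>
#include <string.h>

/* Hypothetical stand-in: a real scanner would attempt to decrypt the
 * archive with this password; here it's a dummy so the sketch runs. */
static bool try_password(const char *archive, const char *password)
{
    (void)archive;
    return strcmp(password, "invoice") == 0;
}

int main(void)
{
    const char *archive = "invoice.zip";
    char message[] = "Hi, the attached invoice is protected, "
                     "the password is invoice";
    const char *delims = " \t\r\n.,:;!?\"'()";

    /* Candidate 1: the filename itself. */
    if (try_password(archive, "invoice"))
        printf("opened with filename-derived password\n");

    /* Candidates 2..n: every word of the message body. */
    for (char *w = strtok(message, delims); w; w = strtok(NULL, delims))
        if (try_password(archive, w)) {
            printf("opened with word from message: %s\n", w);
            break;
        }
    return 0;
}
```

A password that isn’t the filename and doesn’t appear in the message body survives this entirely, which is exactly the key-management point.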
Telling your contacts not to use Google or Meta/Facebook. If everyone you email uses Gmail, then Google has all your emails.