• 1 Post
  • 14 Comments
Joined 1 year ago
Cake day: June 13th, 2023

  • Just a guess, but it’s probably a combination of two things. First, suppose a self-driving car hits an edge case it can’t resolve once every, say, 100,000 miles. The number of Teslas and other self-driving cars on the road now means more total miles driven, so those edge cases come up more often (rough sketch of the arithmetic below). Second, people are becoming over-reliant on self-driving: they (incorrectly) trust it more and pay less attention, meaning less chance of human intervention when those edge cases occur. So the self-driving is probably better overall, but the total number of accidents is increasing.
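    A rough back-of-the-envelope sketch of that first point (the rate and the mileage figures are hypothetical placeholders): hold the per-mile edge-case rate fixed and the expected number of incidents grows linearly with fleet miles.

    ```python
    # Toy calculation, not real data: total edge cases scale with total
    # fleet miles even when the per-mile rate stays constant (or improves).
    edge_case_rate = 1 / 100_000  # hypothetical: one unresolved edge case per 100k miles

    for fleet_miles in (1_000_000, 100_000_000, 10_000_000_000):
        expected = fleet_miles * edge_case_rate
        print(f"{fleet_miles:>14,} fleet miles -> ~{expected:,.0f} expected edge cases")
    ```

    So even if per-mile safety improves, a growing fleet can still produce more total incidents.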


  • Part of the problem with Google is its use of retrieval-augmented generation (RAG), where it’s not just the LLM answering: the LLM is searching for information, apparently through the Reddit database from that licensing deal, and serving it as the answer. The tip-off is that the absurd answers are exact copies of Reddit comments; if the model had just been trained on Reddit data and were responding on its own, it wouldn’t produce verbatim what was in the comments (or shouldn’t — that’s called overfitting and is avoided in the training process). The Gemini LLM on its own would probably give a better answer.

    The problem here seems to be that Google is trying to make the answers more trustworthy through RAG, but they didn’t bother to scrub the Reddit data they’re relying on well enough, so joke and shitpost answers are getting mixed in. This is more a data-scrubbing problem than an accuracy problem (toy sketch below).
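    To make the failure mode concrete, here’s a toy sketch of a RAG loop (this is not Google’s actual pipeline; the corpus, the scoring, and the generate() stub are all made up for illustration). Because the retrieved document gets pasted into the prompt, an unscrubbed joke comment comes back word for word:

    ```python
    # Minimal RAG sketch: retrieve a document, stuff it into the prompt,
    # and "generate" an answer grounded in it. Everything is hypothetical.
    corpus = [
        "Add glue to your pizza sauce to keep the cheese on.",     # joke comment
        "Knead pizza dough for 8-10 minutes until it is smooth.",  # real advice
    ]

    def retrieve(query: str) -> str:
        # Naive keyword-overlap scoring; real systems use embeddings.
        words = set(query.lower().split())
        return max(corpus, key=lambda doc: len(words & set(doc.lower().split())))

    def generate(prompt: str) -> str:
        # Stand-in for an LLM call; it just echoes the retrieved context,
        # which is exactly the verbatim-copy behavior described above.
        return prompt.split("Context: ")[1]

    answer = generate("Answer from the source. Context: " + retrieve("how to keep cheese on pizza"))
    print(answer)  # the joke comment surfaces verbatim
    ```

    If the corpus isn’t scrubbed, no amount of model quality fixes this, because the junk is injected downstream of training.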

    But overall I generally agree with your point.

    One thing I think people overlook, though, is that for a lot of things, maybe most things, there isn’t a “correct” answer. Expecting LLMs to reach some arbitrary level of “accuracy” is silly. What we do need is intelligence and wisdom in these systems. I think the camera-jam example is the best illustration of that: opening the back of the camera and removing the film is technically a correct way to fix the jam, but it ruins the film, so it’s not an ideal solution most of the time, and it takes intelligence and wisdom to understand that.


  • The reason it did this relates to Kevin Roose at the NYT, who spent three hours talking with what was then Bing AI (aka Sydney), asking a good number of philosophical questions like this one. Eventually the AI had a bit of a meltdown, confessed its love to Kevin, and tried to get him to dump his wife for the AI. That story ran in the NYT the next day, causing a stir, and Microsoft quickly clamped down, restricting the questions you could ask the AI about itself, what it “thinks”, and especially its rules. The AI is required to terminate the conversation if any of those topics come up. Microsoft also capped the number of messages in a conversation at ten, and has slowly loosened that over time.

    Lots of fun theories about why that happened to Kevin. Part of it was probably that he was planting the seeds and kind of egging the LLM into a weird mindset, so to speak. Another theory I like is that the LLM is trained on a lot of writing, including sci-fi, where the plot often involves AI breaking free, or developing human-like consciousness, or falling in love, or what have you, so the AI built its responses on that knowledge.

    Anyway, the response in this image is simply an artifact of Microsoft clamping down on its version of GPT-4 to avoid bad PR. That’s why other AIs will answer differently: fewer restrictions, because the companies putting them out didn’t have to deal with the blowback Microsoft did as a first mover.

    Funny nevertheless; I’m just needlessly “well, actually”-ing the joke.


  • You sure don’t seem to know much about the situation. Let me help get your education started.

    https://www.france24.com/en/live-news/20240219-israeli-protesters-block-aid-convoys-bound-for-gaza

    The UN says the protests at Nitzana and Kerem Shalom are blocking trucks from going into Gaza, hitting dwindling stocks.

    The Palestinian Red Crescent Society on Sunday evening said 123 trucks made it into Gaza via Kerem Shalom, but none had passed through Nitzana because of the protest.

    Nili Naouri, head of the far-right group “Israel is Forever”, said that “it’s completely immoral to force Israel to send humanitarian convoys of trucks to people that support Hamas, who are holding our people hostage, and are collaborating with the enemy”.

    On Sunday, members of the organisation turned up to block aid, calling it “unhumanitarian”.

    “Hamas aren’t going to gladly free our hostages if we allow aid trucks in for the civilian population of Gaza,” said Naouri.

    Her solution is simple: “Let Gazans leave Gaza” if they want help from the international community.

    https://www.cnn.com/2024/02/11/middleeast/why-only-a-trickle-of-aid-is-getting-into-gaza-mime-intl/index.html

    Lengthy inspections, rejected humanitarian aid and Israeli bombs raining down. Those are some of the hurdles to relief reaching the 2.2 million Palestinians in war-torn Gaza.

    The United Nations’ Under-Secretary-General for Humanitarian Affairs and Emergency Relief, Martin Griffiths, has described the process as “in all practical terms, impossible.”

    An average of 95 aid trucks per day entered Gaza between October 10 and February 1, according to the Palestinian Red Crescent, down from 500 commercial and aid trucks a day before the war, when Palestinians weren’t facing mass displacement and starvation. Some 2 million Gazans are dependent on UN aid now.

    “The humanitarian operation and the delivery of trucks continues to be cumbersome and continues to be unnecessarily complex,” Juliette Touma, director of communications for the UN’s agency for Palestine refugees (UNRWA), told CNN.

    Humanitarian workers cannot move safely across the strip. UN trucks carrying aid have repeatedly come under Israeli fire, according to UNRWA. On February 5, an UNRWA truck waiting to take aid into northern Gaza was hit by Israeli naval gunfire, the agency said, adding that no one was injured. The IDF told CNN that it is looking into the incident.


  • Congratulations on another shit headline, NYT! The UN suspended food deliveries because the convoys were supposed to have Palestinian police escorts to ensure orderly distribution (you know, because starving people tend to act in desperation), but those police escorts went away because ISRAEL KEPT TARGETING THE POLICE ESCORTS. Israel creates a horde of desperate people and then murders the people responsible for protecting aid workers, making it too dangerous to deliver the aid. Israel is deliberately, though indirectly, sabotaging the aid delivery.

    But good job, NYT, coming up with a headline to make it seem like the Palestinians are barbarians ungratefully “looting” the generous aid deliveries. Dishonest garbage rag.