• 1 Post
  • 1.22K Comments
Joined 2 years ago
Cake day: June 10th, 2023




  • ImplyingImplications@lemmy.ca to Canada@lemmy.ca · Love to see it · 78 points · edited 19 hours ago

    To give a bit of context: the person in second place, Pierre Poilievre, is the leader of the Conservative Party and campaigned to become Prime Minister of Canada. Canadians don’t vote directly for Prime Minister. The country is divided into a few hundred ridings, and each riding elects a Member of Parliament. The party with the most MPs gets to form government, and its leader becomes the Prime Minister.

    Not only has Poilievre failed to win enough seats for the Conservatives to form government, he might not even win his own seat. A seat he has held for 20 years. It would be embarrassing for him and hilarious for all the Canadians who think he’s a dickhead.






  • I think it’s a topic that just doesn’t interest most people, especially children. Where I live, solving problems like 10 - x = 4 for x is taught to 10-year-olds in grade 5. How many 10-year-olds would find that interesting?

    In comparison, grade 5 science teaches that cells are the building blocks of life, that energy comes in forms like electricity and light and can transform between them, and that matter has states like solid, liquid, and gas. That stuff ends up being naturally more interesting.







  • It could hallucinate a citation that never even existed as a fictional case

    That’s what happened in a case reviewed by Legal Eagle.

    The lawyer submitted a brief that cited cases the judge could not find. The judge requested paper copies of the cases, and that’s when the lawyer handed over some dubious documents. The judge then called the lawyer into court to ask why he had submitted fraudulent cases and why his law licence shouldn’t be revoked. The lawyer fessed up that he had asked ChatGPT to write the brief and hadn’t checked the citations. When the judge asked for the cases, the lawyer went back to ChatGPT for them, and it generated the cases…but they were clearly not real. So much so that the defendants’ names changed throughout each case, the judges who supposedly ruled on them were from the wrong districts, and each was only about a page long, when real rulings tend to run dozens of pages.



  • You could probably just say “thank you” over and over. Neural networks aren’t traditional programs that exit early on trivial inputs. If you ask a traditional program to sort a list, the first thing it might do is check whether the input is already sorted and exit if it is. The first thing a neural network does is convert the input into starting values for the variables in a giant equation with billions of parameters, and getting an answer requires calculating the entire thing (rough sketch below).

    Maybe these larger models have some preprocessing of inputs by a traditional program to filter stuff, but seeing as they all seem to need a nuclear power plant and 10,000 GPUs to run, I’m guessing there isn’t much optimization.
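
    Rough sketch of what I mean, assuming Python and numpy (the function names are just mine for illustration, nothing official): a sorter that detects the trivial input and exits early, next to a toy two-layer forward pass that runs the same full matrix math no matter what you feed it.

    import numpy as np

    def sort_with_early_exit(items):
        # Traditional program: detect the trivial case and skip the real work.
        if all(items[i] <= items[i + 1] for i in range(len(items) - 1)):
            return list(items)  # already sorted, almost nothing computed
        return sorted(items)

    def forward_pass(x, w1, w2):
        # Toy two-layer network: every weight participates on every input,
        # so "thank you thank you thank you" costs the same as anything else.
        hidden = np.tanh(x @ w1)
        return hidden @ w2

    rng = np.random.default_rng(0)
    w1 = rng.standard_normal((8, 16))
    w2 = rng.standard_normal((16, 4))
    print(sort_with_early_exit([1, 2, 3]))                    # early exit, barely any work
    print(forward_pass(rng.standard_normal((1, 8)), w1, w2))  # full compute, every time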