Pets are usually provided with everything they ever need by their owners so they have little reason to show their brutal side.
Still a strange take IMO.
Bad individual human behaviour often pales into insignificance when contrasted with wild animal behaviour.
Unless you count the ongoing ecocide we as a collective are in the midst of enacting…
As a massive fan of his older work, I was disappointed to see he has since turned his hand to creating transphobic jokes in his more recent material.
Also find it a bit weird that he consistently proclaims how much he loves animals more than most humans.
Chuck some cheese sauce on that cauliflower, add some stuffing and we’re getting there…
Suddenly this hummus I’m eating for lunch doesn’t quite cut the mustard. Actually on that note, include some mustard in the cauliflower cheese pls.
Oblivious fact: Me
Fun fact: Roasting meat alone does not a Sunday roast make.
Stops carving the Sunday roast and holds off putting the apple crumble in the oven…
But we are one of the most multicultural societies in the world and have long since adopted everyone else’s cuisines.
By this logic the Japanese don’t have curries and the Americans don’t have pizza, or any other food for that matter.
Yeah never got this. The nation’s favourite dish is curry. My favourite dish is curry. Isn’t it a running joke amongst Indians how much the Brits love curry?
Things like beans on toast and fish finger sandwiches are cheap and easy lunch snacks for students but not our actual diet.
Oh yeah agree, he has the reverse Midas touch at the mo. Might add it coincided with the media flipping sides when they saw it was a foregone conclusion.
One of the admittedly minor things I dislike about his tetchy interview answers is when he starts going “blah blah blah…that’s why we’ve done things such as…” and then proceeds to list the one and only example of said thing.
Nowadays?!
Cavity protection ain’t gonna cut it where they’re going.
🇺🇸 “Baadel a waader” 🇺🇸
✅ Math is hard
❌ This math is hard
Yep my sentiment entirely.
I had actually written a couple more paragraphs using weather models as an analogy akin to your quartz crystal example but deleted them to shorten my wall of text…
We have built up models which can predict what might happen to particular weather patterns over the next few days to a fair degree of accuracy. However, to get a 100% conclusive model we'd need information about every molecule in the atmosphere, which just isn't practical when a good-enough model already gives us an idea of what is going on.
The same is true for any system of sufficient complexity.
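To make the point concrete, here's a rough toy sketch (my own illustration, using the Lorenz system rather than a real weather model): two runs whose starting conditions differ by one part in a million still end up in completely different states after a few simulated "days", which is why forecasts can only ever be good enough rather than conclusive.

import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # One crude Euler step of the classic Lorenz system (a toy chaotic model).
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return state + dt * np.array([dx, dy, dz])

a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-6, 0.0, 0.0])  # a tiny "measurement error" in one variable

for step in range(3000):
    a, b = lorenz_step(a), lorenz_step(b)
    if step % 500 == 0:
        # The two trajectories start microscopically close and end up nowhere near each other.
        print(f"step {step:4d}: separation = {np.linalg.norm(a - b):.6f}")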
This article, along with others covering the topic, seems to foster an air of mystery around machine learning which I find quite off-putting.
Known as generalization, this is one of the most fundamental ideas in machine learning—and its greatest puzzle. Models learn to do a task—spot faces, translate sentences, avoid pedestrians—by training with a specific set of examples. Yet they can generalize, learning to do that task with examples they have not seen before.
Sounds a lot like Category Theory to me which is all about abstracting rules as far as possible to form associations between concepts. This would explain other phenomena discussed in the article.
Like, why can they learn language? I think this is very mysterious.
Potentially because language structures can be encoded as categories. Any possible concept including the whole of mathematics can be encoded as relationships between objects in Category Theory. For more info see this excellent video.
He thinks there could be a hidden mathematical pattern in language that large language models somehow come to exploit: “Pure speculation but why not?”
Sound familiar?
models could seemingly fail to learn a task and then all of a sudden just get it, as if a lightbulb had switched on.
Maybe there is a threshold probability of a posited association being correct and, after enough iterations, the model flipped it to "true".
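Purely as a toy illustration of that speculation (not how any real model actually works): an internal confidence score creeps up a little every iteration, but the externally visible answer only flips once it crosses a threshold, so from the outside it looks like a sudden lightbulb moment.

confidence = 0.05  # hypothetical internal score for some association
for iteration in range(1, 151):
    confidence += 0.004  # gradual internal improvement each iteration
    if iteration % 30 == 0:
        # The printed answer stays wrong for ages, then abruptly becomes right.
        print(f"iter {iteration:3d}: confidence={confidence:.2f}, "
              f"answers correctly: {confidence > 0.5}")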
I’d prefer articles to discuss the underlying workings, even if speculative like the above, rather than perpetuating the “It’s magic, no one knows.” narrative. Too many people (especially here on Lemmy it has to be said) pick that up and run with it rather than thinking critically about the topic and formulating their own hypotheses.
And again…
You’ve just copied my arguments yet again.
Seek help, your projections are concerning.
You don’t really have one lol. You’ve read too many pop-sci articles from AI proponents and haven’t understood any of the underlying tech.
All your retorts boil down to copying my arguments because you seem to be incapable of original thought. Therefore it’s not surprising you believe neural networks are approaching sentience and consider imitation to be the same as intelligence.
You seem to think there’s something mystical about neural networks but there is not, just layers of complexity that are difficult for humans to unpick.
You argue like a religious zealot or Trump supporter because at this point it seems you don’t understand basic logic or how the scientific method works.
I hadn’t really noticed this, perhaps in hindsight you’re correct. Can you be more specific?
Gareth in The Office was a pervy/racist character who pretty much needed the abuse to stop him making others' lives hell.
With Maggie in Extras I saw it as a best friend banter type thing. The piss taking was done in private and came across in jest.
If you’re referring to Karl Pilkington, I think he knew what he signed up for, was playing an exaggerated version of himself for comedic effect and made a lot of money doing so.
When Derek came out I couldn’t get past the obvious, which is pretty much where I stopped following his material.