looking good, looking good. Can’t wait to make it my main frontend. Keep up the great work @[email protected]
I posted about it in this community; it’s got a link to the GitHub comment regarding the matter.
They rolled it back, and delayed the hard requirement to next year.
After a quick skim, my favs:
Inline completions for multiple cursors
and Command duration tracked.
I wonder, for people who don’t have Copilot, these updates must feel pretty barren
Here’s an article regarding the matter:
https://www.omgubuntu.co.uk/2024/02/vscode-drops-ubuntu-18-04-support-leaves-devs-screwed
Seems like detachable panels; if you scroll all the way down the article you’ll see a video of it working. So excited for this feature. In the past I’ve done the whole “two instances sharing a workspace” thing, but as you said, it’s such overkill.
Floating editor windows exploration
We have started to explore how to pull editors out of the workbench window into their own windows. This feature is our highest upvoted feature request and we plan to have a first working version for our Insiders users to play with in October.
you sound like a republican
not trying to shit stir, but can men really wear balaclavas anywhere they want?
you’re in a room with ten people, and you shout “I like turtles”
vs
you’re in a room with ten people and a thousand bots, and you shout “I like turtles”
what’s your problem again?
Inception
(hard mode: say Tenet is in the same world)
numba one
maybe put “This is what it took to defederate from exploding-heads after being federated with them since the start of lemmy.world” before the screenshot of the post, as right now it seems confusing.
Are we assuming AI won’t be able to create a good prompt? 😂
Great work on following up with examples, very informative.
People really don’t understand the current state of LLMs. Like the pictures generated: “It’s a really good picture of what a dog would look like, but it’s not actually a dog.” Like a police sketch, with a touch of “randomness” so you don’t always get the same picture.
I’m guessing they will try to solve this issue with some cheap human labour to review what is being generated. These verifiers will probably not be experts on all the subjects that the LLM will be spitting out; more of a “That does kind of look like a dog, APPROVED”.
Let’s say I’m wrong, and LLMs can make an article as good as any human’s. The content would be so saturated (even a tumblr user could now make as good and as much content as one of these companies) that I would expect companies to be joining in on all the strikes 😆.
Funny world we are all going into.
Happy New Year
🧟John is a 🍎lemm.app user; he subscribes to the 🐢turtle community on 🍌lemm.ban
he is the first ever to do this on 🍎lemm.app
so 🍎lemm.app creates a copy with the last 20 posts and from then on keeps in sync with future posts
👩🚀Jill is also a user on 🍎lemm.app; she subscribes to the same 🐢turtle community, but a year later.
she will be able to see all the posts from that year, all the way back to those 20 posts.
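the flow above can be sketched in a few lines of Python. This is a toy model, not the real Lemmy/ActivityPub code — the class names, the push mechanism, and the 20-post backfill limit are all illustrative assumptions:

```python
# Toy sketch of instance-to-instance community mirroring.
# Hypothetical names; not the actual Lemmy/ActivityPub API.

BACKFILL_LIMIT = 20  # assumption: only the most recent posts copied on first subscribe


class RemoteCommunity:
    """A community living on its home instance (🍌lemm.ban)."""

    def __init__(self, name):
        self.name = name
        self.posts = []      # full post history on the home instance
        self.followers = []  # instances that asked for future updates

    def publish(self, post):
        self.posts.append(post)
        for instance in self.followers:
            instance.receive(self.name, post)  # push new posts to every mirror


class Instance:
    """A federated instance (🍎lemm.app) holding local mirrors."""

    def __init__(self, name):
        self.name = name
        self.mirrors = {}  # community name -> local copy of its posts

    def subscribe(self, community):
        if community.name not in self.mirrors:
            # First local subscriber (John): backfill only the last 20 posts,
            # then register for all future posts.
            self.mirrors[community.name] = community.posts[-BACKFILL_LIMIT:].copy()
            community.followers.append(self)
        # Later subscribers (Jill) just read the existing, synced mirror.
        return self.mirrors[community.name]

    def receive(self, community_name, post):
        self.mirrors[community_name].append(post)


turtles = RemoteCommunity("turtle")
for i in range(50):
    turtles.publish(f"post {i}")

lemm_app = Instance("lemm.app")
lemm_app.subscribe(turtles)   # John: mirror starts at the last 20 posts
turtles.publish("post 50")    # pushed to the mirror automatically

jill_view = lemm_app.subscribe(turtles)  # a year later
print(len(jill_view))  # 21: the 20 backfilled posts plus "post 50"
```

so Jill sees everything published since John subscribed, but nothing older than the initial 20-post backfill.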
a few community blocks here and there should help
Local workspace extensions
This one will be useful for when I’m coding inside a Docker container that only has a persistent workspace.
nice update, reminded me that I have no binding action for middle button clicking 😱