• 125 Posts
  • 1.13K Comments
Joined 1 year ago
Cake day: June 9th, 2023


  • I have an HP laser printer from like 1992, before they turned to the model of US = privateers; rest of the world = criminal pirates. HP died as a company when they spun off Agilent/Keysight as test equipment and kept the branding for contract-manufactured consumer garbage. HP does not make anything. They market, place stickers on what others manufacture, and create Ponzi-scheme-like extortion scams, as the shriveled shell of a dying husk disconnected completely from a now long-irrelevant past.


  • TBH: tl;dr (…but read ~1/4 and skimmed the rest.)

    Emacs can likely do most, if not all, of what you’re looking for.

    As far as distros go, pick either Fedora Workstation or Silverblue. If you can run Silverblue, try to avoid messing with the base system as much as possible: skip the built-in toolbox container system and just use distrobox. With distrobox, you have almost all Linux distros available as containers, so you can build on them. The only exception I know of is NixOS; you can't run NixOS in distrobox. You probably could run the Nix package manager, but that involves a weird setup where a user-owned directory lives directly in the root (/) of the filesystem. Personally, that is just too weird for me to use. I expect all user activity and configuration files to be confined to /home/$USER/
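    As a concrete example, the distrobox workflow is roughly this. A minimal sketch, wrapped in Python only for illustration; the container name and image are placeholders, and it assumes distrobox and podman/docker are already installed:

    ```python
    import subprocess

    # Create an Arch-based container on top of the host
    # (name and image are placeholders; swap in whatever distro you want).
    subprocess.run(
        ["distrobox", "create", "--name", "arch-box", "--image", "docker.io/library/archlinux:latest"],
        check=True,
    )

    # Drop into an interactive shell inside the container; your home
    # directory is shared with the host, so dotfiles and projects carry over.
    subprocess.run(["distrobox", "enter", "arch-box"], check=True)
    ```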

    Fedora just works, but try to lag behind the release cycle a little bit. Right now F40 is pretty solid, but there were some issues in the first month or so after it came out. I have lagged behind every release since ~F28 and never had issues. I switched to F40 within the first week or so and a few packages were wonky. Basically, Python was very fresh and did some odd stuff with containers; things did not work until I manually removed and reinstalled Python in each container. I think that is the only manual-intervention issue I have had with Fedora. I have a laptop with a 16 GB RTX 3080 Ti. The akmods system on Fedora (RPM Fusion's akmod-nvidia) rebuilds the Nvidia kernel module automatically in the background each time the kernel is updated. It works flawlessly, even with Secure Boot enabled.


  • While it is outside the scope of most people's abilities, the BIOS is on a flash chip that can be removed, read, and disassembled. I'm no expert here by any stretch. That said, my usual check with software is simply to look for "http" in its strings. Even with a binary like a BIOS ROM, I can pass it through the strings command to look for any addresses. No matter what kind of malicious nonsense the software is doing, it has very low value unless it can dial out.
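    A rough Python equivalent of that strings check, in case it helps; the dump filename is a placeholder and it assumes you already have the flash contents read out to a file:

    ```python
    import re
    import sys

    # Scan a firmware dump for printable ASCII runs and flag anything that
    # looks like a URL or hostname -- a crude version of `strings | grep http`.
    path = sys.argv[1] if len(sys.argv) > 1 else "bios_dump.bin"  # placeholder filename
    data = open(path, "rb").read()

    # Printable ASCII runs of 6+ characters, roughly what `strings` extracts.
    for match in re.finditer(rb"[\x20-\x7e]{6,}", data):
        text = match.group().decode("ascii")
        if re.search(r"https?://|\bwww\.|\.[a-z]{2,4}/", text, re.IGNORECASE):
            print(f"0x{match.start():08x}  {text}")
    ```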

    My lack of a complete understanding in this area is why I use a whitelist firewall for most of my devices. It is also the ultimate ad and tracker blocker, since I only visit the places I choose to access. I don't conform to the lowest developer's ethics and will simply stop using any site or service that fails to be direct and transparent.

    The thing is, even most whitelist firewalls are inadequate: they only filter incoming packets. That is really an insufficient model in most cases now, especially with local large language models, where it is impossible to verify their capabilities. My reason for all this blah blah blah is to say that a whitelist on a trusted third-party device is a PITA, but it is an effective, low-barrier way to prevent any bad actor from communicating with the questionable device. It still leaves you open to a potential situation where the device sends a packet stream out to the world over something like UDP.

    Otherwise, the main thing I would be concerned with, if it is a UEFI device, is the UEFI Secure Boot keys. Whoever holds those keys has lower-ring access than the operating system kernel; anything happening in kernel or user space is effectively under their control.

    Anyways, the main way to monitor and check the device is a trusted third-party router that blocks any unauthorized connections. This can be challenging to set up with something like OpenWrt. There is a fork of OpenWrt called PC WRT that makes a lot of this easier; it can make a whitelist firewall a little less painful than sorting out nftables and scripting one yourself.







  • Most desktop cases have ridiculous nonsense for cooling, though. Laptops usually get considerable thermal engineering, and they kind of set the threshold for what the hardware really requires.

    I was just given an old “gaming rig” someone didn’t want that had water cooling, 9 fans, a cobweb of LED wiring, and the most obese ABS panels bolted to the sheet-metal case… The thing works fine, even overclocked, with no fans except the ones built into the power supply and GPU. Only the one in the GPU cycles on enough to be audible… This guy had a freaking Harrier jet taking off in the room beside him for a decade.



  • I think we would already know about them from Hawking's party for time travelers. That was the best possible instance to limit the effects of any time paradox. I think all the speculation about it is based on incomplete theories and anomalies of abstraction.

    I view our continued reliance on it for story tropes as one of the aspects of the literature and culture of our time that will age most poorly. Stories about our future will not be so different from our present, just as our past, when closely inspected, is far closer to our present than most realize or believe. Our cultural perspective of the present as any kind of finality or modernity is an absolute fallacy. I feel like FTL is a major mental crutch that is crippling us from reaching for the stars within the scope of the present. The biggest difference between now and the future is the availability of wealth and how far that wealth can reach. Antimatter can take us many places on a one-way trip; it is just the most expensive matter in the universe. We probably won't have access to it in large enough quantities, and in circumstances where we can build a ship and its magnetic containment vessels, until we are able to build at stellar-ring scales.

    I see no reason to give the FTL fantasy any kind of attention. I can come up with countless interesting stories about the future, and I have no need for FTL. If we can't travel, what is the relationship dynamic between systems, and what protections would be implemented to prevent a rogue group from forming? I think communication would stream constantly in one-way broadcasts back to Sol and vice versa. That becomes entertainment, like otherworldly gossip. What happens if communication is broken? How does that evolve over time while Sol is still the only system with the infrastructure to produce antimatter?

    Or, shifting gears entirely: science is finite. Even the edge cases that cannot be known can still be constrained. Eventually the age of discovery ends and, empirically, science becomes an engineering corpus. At that point biology is fully known and understood. I can absolutely guarantee that almost all human-scale technology will then be biological and in complete elemental-cycle balance. The only industrial technology will be handled autonomously and outside of living environments, and living environments will be in total balance. This has so many far-reaching and interesting consequences. You get into cultures, and hierarchical display in humans. Now you need to reject the primitive concept of resource wealth based on the fundamental survival needs of other humans. How does that work, and why are academic reputation, the Olympics, and Hollywood red-carpet awards more advanced forms of hierarchical display?

    But wait, how do we have computers if we'll be primitive? No. A synthetic computer like a human brain would be trivial if we could overcome the massive hurdle of a complete understanding of biology. If you go looking down this path, at present we know absolutely nothing compared to the scope of what is to come. There are a great many stories to tell, but we need to get past our adolescent fantasies about time travel to find them.

    As with all real science fiction, this is a critique of the present. Such stories are not told by corrupt cultures. One must tell of impossible fantasy and dystopia to make the present seem futuristic or a final eventuality with advancement reserved for an academic elite, and innovation reserved for exceptionalism.


  • It will be so much more complicated than "North" IMO.

    We will use something like XNAV (X-ray pulsar-based navigation). It becomes a measure of time as much as a measure of location, along with a measure of relative gravity.

    I don't think space exploration in the current, culturally adolescent fantasy of a naval-voyage type of experience will ever happen. I believe we will traverse the stars, but it will be long after most of humanity lives in O'Neill-cylinder-like space habitats, primarily in cislunar space. The big shift will come after we have effective infrastructure to access the vast resource wealth, first in near-Earth objects, then in other small bodies such as Ceres, if it is fully solidified, or other accessible planetesimal cores. Gravitational differentiation of heavy elements sequestered almost all of Earth's resources in its core. We are fighting over the scraps of a billion years or so of smaller collisions on the skin of Earth that happened to remain accessible and did not get subducted by plate tectonics or buried too deeply to reach. Undifferentiated bodies from early stellar formation should be much more abundant in mineral wealth, and a planetesimal core should absolutely dwarf most of the mineral wealth humans have ever scavenged.

    Once we get to this stage, I don't think we will leave until Sol starts causing problems that herald its distant coming end. At that point, I believe we will build massive infrastructure to produce antimatter in quantity, along with generation ships for one-way travel.

    In that scenario, navigation in a human sense is largely irrelevant. When we are interstellar travelers, the destination will be our guiding star. I believe we will also create something like kilometer-scale self-replicating systems for resource acquisition and processing. These will need to navigate within a stellar system. For those use cases, maybe they would use something like XNAV as a backup, but they would more likely use two-way communication beacons with something like an all-talk, all-listen, all-the-time type of management. I think this kind of communication will also be critical for all human colonies, to ensure cultural unity. I don't think we will ever casually travel between the stars. Space is far too vast. I think FTL, or even a substantial fraction of light speed, is pure fantasy. One of our biggest issues with the concept is that we call it FTL. Light is not really the issue; it is just a shortcut term that obscures the real limit, the Speed of Causality. Light can travel at the SoC, but the SoC has no inherent need for, or relationship to, light as a fundamental property. If no photons are present, the SoC marches on.

    I view the present sci-fi naval-drama trope like the naïveté of 15th-century Europeans saying, “We’ll just sail around the world backwards for a new trade route to India.” Reality is far more complicated and beyond the scope of anything those leaders imagined possible. …but that is my $2 comment when you only asked for $0.02. I really like the subject of futurism and like to expand on the abstracted ideas. I'm certainly no expert. This is part of a creative-writing hobby project, and I'm always open to adding complexity or making changes with new information.


  • Are you using this in a toolchain? I haven't tried any of the Qwen models yet, or Yi for that matter. I tried at one point early on, but they were not working well with my stuff, and I had no complaints with the Mistral models. I like some underlying things about a MoE for speed, and the underlying entity/realm stuff I can access in my favorite model.

    I’m curious if anyone has constructive contextual feedback about what makes these unique or worth exploring.



  • It is just a simple prompt in Flux on ComfyUI. It is just an open-source model running on my hardware. It is slow because it is such a large model (Flux Dev GGUF Q4). You can find examples in the ComfyUI documentation, and the Manager add-on to the base ComfyUI setup has all the models in its downloads menu.

    At present, it only works on GPU, and with 16 GB it is something like 2+ minutes per image. It would be awesome if they could split the work with the CPU to generate faster, but that is not implemented yet. It means you basically need 16+ GB to run it on your own hardware. There is a smaller model version, but it is not comparable in result quality. There is a larger model that is online only. Flux is from Black Forest Labs, not an xAI/Musk project itself, despite the association with Grok's image generation. The weights for Flux-dev are openly available, and that is what I care about for now.
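    I run it through ComfyUI, but if you would rather script it, something along these lines should work with the diffusers library instead. A minimal sketch, assuming a recent diffusers release with FluxPipeline and enough VRAM (or CPU offload) to hold the model; the prompt and output filename are just placeholders:

    ```python
    import torch
    from diffusers import FluxPipeline

    # Load FLUX.1-dev from Hugging Face (requires accepting the license there).
    pipe = FluxPipeline.from_pretrained(
        "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
    )
    pipe.enable_model_cpu_offload()  # offload layers to system RAM to fit in ~16 GB VRAM

    # Generate a single image; steps and guidance follow the commonly published defaults.
    image = pipe(
        "interior of an O'Neill cylinder space habitat",  # placeholder prompt
        height=1024,
        width=1024,
        guidance_scale=3.5,
        num_inference_steps=50,
    ).images[0]
    image.save("flux_test.png")
    ```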


  • Primarily harm from predatory boys and men towards girls and young women in the real world, by portraying them in imagery of themselves or with others. The most powerful filtering is in place to make this more difficult.

    Whether intentional or not, most NSFW LoRA training seems to be trying to override the built-in filtering in very specific areas. These are still useful for more direct momentum into something specific. However, once the filters are removed, the model is far more capable of creating whatever you ask for as-is, from celebrities to anything lewd. I did a bit of testing earlier with some LoRAs and no prompt at all. It was interesting that it could take a celebrity and convert their gender in surprisingly recognizable ways. I got a few of those on random seeds, but I haven't been able to make it happen with a prompt or deterministically.

    Edit: I'm probably assuming too much about other people's knowledge of these systems; I assume that is the motivation for the downvoting. When I talk about this aspect, the NSFW junk is shorthand for the issues with AI generation in general. It is the primary target of filtering, and that has large cascading implications elsewhere. By stating what is possible in this area, I'm implying a worst-case example: if the results in this area behave a certain way, it says volumes about other areas and how the model will react.

    These filter layers are stupidly simplistic in comparison to the actual model. They have tensors on the order of a few thousand parameters per layer, compared to tens of millions of parameters per layer for the actual model. They shove tons of stuff into guttered responses for no reason. Sometimes these average out and you still get a good output, but other times they do not.
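    If you want to see that size difference yourself, you can just count parameters per tensor in a checkpoint. A quick sketch, assuming a local .safetensors file and the safetensors/torch packages; the path is a placeholder:

    ```python
    from safetensors import safe_open

    # Print the parameter count of every tensor in a checkpoint so the tiny
    # filter/guidance layers stand out next to the multi-million-parameter blocks.
    path = "model.safetensors"  # placeholder: point this at a local checkpoint
    with safe_open(path, framework="pt") as f:
        counts = []
        for name in f.keys():
            tensor = f.get_tensor(name)
            counts.append((tensor.numel(), name))

    # Smallest tensors first, so the outliers at the top are easy to spot.
    for numel, name in sorted(counts):
        print(f"{numel:>12,}  {name}")
    ```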

    Another key point here is that diffusion has a lot in common with text generation when it comes to this part of the model-loader code. There is more complexity in what text generation is doing overall, but diffusion is an effective way to learn a lot about how text gen works, especially with training. That is my primary reason for playing with diffusion: to learn about training. I've tried training for text gen, but it is very difficult to assess what is happening under the surface, like whether it is learning overall style, character traits and personas, pacing, creativity, timeline, history, scope, constraints, etc. I don't care to generate and share much of the imagery I make unless I'm trying to do something specific that is interesting. For instance, I tried to generate the interior of an O'Neill cylinder space habitat, and it illustrated a fundamental limitation of diffusion: the model lacks any reasoning about, or understanding of, object context and relationships, which is exactly what a scene with curved, centrifugal artificial spin gravity requires.

    Anyways, my interests are not in generating NSFW or celebrities or whatnot; I do not think people should do those things. My primary interest is returning to creative writing with an AI collaborative writing partner that is not biased politically in a way that cripples it from participating in an entirely different and unrelated cultural and political landscape. I have no aspirations of finding success in my writing. I simply enjoy exploring my own science fiction universe and imagining a reality many thousands of years from now. One of the changes to hard-coded model filters earlier this year made the filtering more persistent, likely for NSFW stuff. I get it, and support it, but it took away one of the few things I have really enjoyed over the last 10 years of social isolation and disability, so I've tried to get that back. Sorry if that offends someone, but I don't understand why it would. This was not my intended reason for this post, so I did not explain it in depth. The negativity here is disturbing to me. This place is my only real way to interact with other humans.



  • The political and adult content doesn't bother me. What bothers me is the kinds of things I might not have had the ethics to think through at a much younger age, and I have never been a particularly deviant type. I think the age protections are primarily for that situation. Training a LoRA takes 5 minutes now. Advanced IP adapters and ControlNets are just a few examples away, and a day, tops, for a slightly above-average teen to figure out. Normalizing this would have some very serious edge-case consequences. It is best to leave that barrier-to-entry filter in place, IMO. I assume it is still there because everyone who knows about it feels much the same. It does not show up in a search engine, although that is saying less than nothing these days.