• 2 Posts
  • 61 Comments
Joined 3 years ago
Cake day: July 1st, 2023

  • That’s a tough one. Those small, pointy overhanging ledges pose a lot of problems for printers, and PETG is not a forgiving filament type.

    As others have said, PETG can be a harder filament to print. Even dry, it tends to be more viscous, leading to oozing and stringing. I’m not convinced that’s the problem, but it could be part of it: build-up from stringing or over-extrusion can cause collisions, leading to something like this.

    The damage looks like it’s happening on one side. That hints at either a cooling problem, or a movement or seam-placement problem.

    Looking at pictures of your printer, it looks like it has two fans, so I suspect that side is getting direct cooling, and the open frame means it’s probably not a wall of the enclosure that’s affecting it.

    Related to movement, speed/acceleration could be an issue. You might have heard scratching while printing in this area. If so, slight warping during cooling or from over-extrusion could lead to the nozzle colliding with the print. On a more solid print you could probably get away with ignoring it, since it wouldn’t affect the result, but with such small parts, small impacts over time will knock parts off or distort them. Try slowing down the print. Most of the print here is delicate, but you can use slicer modifiers if you want other parts of the print to stay fast.

    Not sure how much that adds or helps but good luck.


  • I guess sort of.

    I always saw them cited as ad-ridden and tangled in the complexity of recipe copyright (you can copyright a story but not a recipe).

    But I guess there’s a convergence in that the Google ad ecosystem relied on SEO nonsense, and the quality varied pretty widely, so some sites were just aggregated bad recipes optimised to get ad views.

    There were enough real sites, and the bad ones were easy enough to sniff out, that it seemed a reasonable compromise at the time.


  • For me it’s been such a night-and-day experience that it’s hard to imagine needing to explain why Wayland has been better. But I’ll try.

    The big thing that got me to switch was actual multi-monitor support. X has a bunch of hacks that “work”, but it’s a mess and constantly broke for me. I’d just randomly log in and it was broken, and I’d spend a day in xrandr and X11 conf files rebuilding it from scratch for no apparent reason. Wayland multi-monitor has just worked for years now. It’s also real multi-display support, and really quite good.

    I’ve seen complaints about Nvidia, but even with them dragging their heels I’ve had a better experience with their drivers on Wayland. It’s probably tied again to multi-monitor support, but it’s just been smoother, and I notice if I accidentally log into an X session, even on a single-monitor setup, because things are clunky and features are missing.

    Anecdotally, DEs feel like they start faster and run smoother. I saw fewer crashes after switching as well. The crashing might be better on X these days, but I don’t see a reason to test it.

    For the sake of transparency, it’s not perfect. Compatibility really has been great, and I struggle to tell what’s not native. But this is the Linux desktop, and there are challenges regardless of your choices.

    I enjoyed the Guake terminal, and it’s a bit troublesome to make work well.

    The one other thing that’s been troublesome is some screen-capture stuff. Honestly, the screen sharing in Wayland is lovely and so much better when it works.

    But some programs do their own thing and want full desktop control, and that’s a struggle. For example, Moonlight/Sunshine require what seems to be some extra tinkering. Similarly, screen-collaboration apps that try to do the full-control thing tend to not work well, or at all.



  • I kind of understand why someone would, honestly. Jellyfin subtitles are still a hot mess for a lot of formats, unfortunately. Also, while Plex has tried really hard to ruin their UI, I’ve still had more trouble explaining where to find things in Jellyfin. And if you’re sharing your collection with friends or family members, there’s a lot more technical stuff involved.

    So I can see why the balance might still tip toward paying for Plex for some people.

    Luckily, I bought a lifetime license ages ago, before the first price hike, so this doesn’t affect me yet. So I’m just riding out the decline, running them in parallel until Plex completely breaks, slowly transitioning the family as they get annoyed with broken features. Plexamp is quickly taking care of that 😅




  • I miss really digging deep into what my system was doing and understanding how the different components worked. I had choices at every step and owned every package, feature, and configuration. I also liked being able to easily patch and collaborate on fixes with maintainers through a local overlay.

    I also feel like that understanding gave me a knowledge of the dark magics of how and why distros work, forged in the mistakes of my Gentoo systems, that’s been valuable in my career.

    That said, I don’t really have time for it these days. Being able to just turn my computer on and have it work with a mainstream binary distro is a stability I’ve needed for things like work and home servers for family stuff.

    Some people aren’t patient when you need to entirely rebuild your system because you broke an ebuild or didn’t read a news item, it trashed your system, and you’ve got several hours of recompiling system packages ahead of you.

    That said, I’ll perpetuate the trope and say I broke down and finally started running Arch on some personal machines this year, and I enjoy it. It’s not the same, but it’s filled a bit of that itch, and it’s fun to push the edge and find other people doing the same.



  • neclimdul@lemmy.world to Linux@lemmy.ml · Why? · 4 months ago

    It was a challenge I wanted to conquer, but I also increasingly felt like I didn’t own my computer. The software was increasingly cutting me out of the ability to modify and use it the way I wanted.

    I spent a lot of time in Gentoo early on, where patching software was just an overlay and a recompile away, and it was great for testing early amd64 bugs and pushing the limits with Gaim and reverse-engineering chat protocols.

    I was doing some dual booting then, but as I built a career in web development, it became more and more my daily driver. Running the same platform you’re developing for is incredibly convenient, and Linux runs the web.

    Now I can’t imagine running windows. Using it and helping people on it is just a miserable experience for me.




  • I’ve been using COSMIC on and off since the beta release, and it is still pretty beta. It’s really good at this point, honestly, and a huge achievement for them, but not without some annoying bugs for me.

    Just something to consider before jumping in. You should be ready to work around some annoyances, deal with some slowness/quirks, and probably provide feedback and bug reports.



  • It doesn’t have to avoid JavaScript entirely; that would be quite difficult and unreasonable. Accessible sites aren’t about limiting functionality but about providing the same functionality.

    I haven’t gone fully down the rabbit hole on this, but my understanding is that even something like Nuxt, if you follow best practices, will deliver HTML that can be interacted with and serve individual pages.

    That said, screen readers and other assistive support shouldn’t require running without any JavaScript. Having used them to test sites, that might be the smart approach, but they actually have a lot of tools for announcing dynamic website changes that are built into ARIA properties at the HTML level, so they’re very flexible. There are, of course, also JavaScript APIs for announcing changes.

    They just require additional effort and forethought to implement, and can be buggy if you do really weird things.
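
    The HTML-level approach mentioned above usually means an ARIA live region. A minimal sketch (the `#status` id and the `announce` helper are my own illustration, not any framework’s API): screen readers announce text inserted into an element marked `role="status"` without moving focus, because that role implies `aria-live="polite"`.

```javascript
// The page would contain a (often visually hidden) container such as:
//   <div id="status" role="status"></div>
// role="status" implies aria-live="polite": screen readers announce any
// text inserted into it without stealing keyboard focus.

function announce(liveRegion, message) {
  // Clearing first helps some screen readers re-announce a repeated message.
  liveRegion.textContent = "";
  liveRegion.textContent = message;
}

// In a browser you would call it after a dynamic update, e.g.:
// announce(document.getElementById("status"), "3 results loaded");
```

    The same idea works with `aria-live="assertive"` for urgent messages, though that interrupts whatever the screen reader is currently saying, so polite is the usual default.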