• 0 Posts
  • 40 Comments
Joined 1 year ago
Cake day: June 9th, 2023

  • I understand not liking Apple, but my point was more that x86, even good x86, is still literally hot trash if you want anything resembling modern performance.

    I really hope that someone steps up with ARM-based laptops that can natively run Linux (because screw Microsoft and the shitty ARM stuff they’ve done to date) and that they ship at a reasonable price and with sufficient performance. Until then, the sole vendor that can provide cool-running, silent, high-performance ARM with 15ish hours of battery life is… Apple.


  • No, not really: even at idle the fans are still moving air, and the laptop is warm enough that you can notice it. You CAN force them off, but then you’ve got a laptop that gets unbearably hot pretty quickly, so that’s not really a workable tradeoff.

    I’ve honestly just kinda given up and use the M1 for everything because it literally never gets warm, and never makes a single sound unless I do something that uses 100% CPU for an extended period of time.


  • Windows Task Manager is a poor indicator of actual clock speed for a number of reasons, one of which is that it reports the highest clock speed rather than the lowest, which on CPUs with a lot of cores isn’t really representative of what the CPU is actually doing. Looking at individual core clocks and power usage gives a much better picture of what’s actually happening.

    That said, I’ve had pretty bad luck with x86 laptops with the higher-end CPUs; even if you get them down to fantastic power usage they’re still… not amazing. I managed to tweak my G14 into using about 10W at idle, which sounds great, until you look at my M1 MacBook, which idles under 3W. (There’s a rough sketch at the end of this comment of how you can check per-core clocks and power draw on Linux.)

    If thermals are really a concern, you may want to look at the low-voltage variants rather than the high-performance ones, though that’s a tradeoff all on its own.
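
    A rough sketch (illustrative, not gospel) of reading per-core clocks and battery draw on Linux: the /proc/cpuinfo and /sys/class/power_supply paths used here are the common ones, but they vary by machine and kernel, so adjust for your hardware.

    import statistics
    from pathlib import Path

    def per_core_mhz():
        # Current frequency per core, as reported by the kernel in /proc/cpuinfo.
        freqs = []
        for line in Path("/proc/cpuinfo").read_text().splitlines():
            if line.lower().startswith("cpu mhz"):
                freqs.append(float(line.split(":")[1]))
        return freqs

    def battery_watts(bat="BAT0"):
        # power_now is in microwatts on most laptops; some machines expose
        # current_now/voltage_now instead, in which case this returns None.
        p = Path(f"/sys/class/power_supply/{bat}/power_now")
        return int(p.read_text()) / 1_000_000 if p.exists() else None

    if __name__ == "__main__":
        freqs = per_core_mhz()
        if freqs:
            print(f"{len(freqs)} cores: min {min(freqs):.0f} MHz, "
                  f"median {statistics.median(freqs):.0f} MHz, max {max(freqs):.0f} MHz")
        watts = battery_watts()
        print(f"battery draw: ~{watts:.2f} W" if watts else "no power_now reading")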



  • Eh, I wouldn’t go with ‘the self-hosted admins didn’t do anything!’. There never really was a time when the majority (or even a meaningful minority) of users hosted their own email.

    In the beginning, you got your email address from your school or your ISP, and it changed whenever you left/changed providers, so the initial “free” email came from the likes of Hotmail (which rapidly became Microsoft), Yahoo (which was uh, Yahoo), and offerings from the big ISPs of the era, like AOL and whatnot.

    You still had school and ISP email, but it just rapidly fell out of fashion because your Hotmail/Yahoo/AOL email never changed regardless of what ISP you used or whatever, so it was legitimately a better solution.

    And then Google came along with Gmail and it was so much better than every other offering that they effectively ate the whole damn market by default because all the people who were providing the free webmail at that time didn’t do a damn thing to improve until after Google had already “won”.

    So if you want to be mad, this is firmly Microsoft and Yahoo’s fault for being lazy fucks.





  • I kinda have two answers to this:

    1. Not yet.

    2. It was more meant to show that they’re not some shining defender of the ad-free, private internet, and that they’d absolutely take action to defend a potential future revenue stream if they thought it might be profitable later.

    Remember everyone: corporations are not your friend, your buddy, or your pal, and they don’t give even the slightest shit about you beyond how much money they can extract from your wallet; anything that gets in the way of them doing so, they’ll work around, stomp on, and kill by any means necessary.



  • They’re not wrong in that most people aren’t suited to, and shouldn’t be, running what are effectively public services for other people from some surplus Dell R410 they found on eBay for $40.

    That said, it’s all a matter of degree: I don’t host critical infra for people (password managers, file sharing, etc.) where data loss would be catastrophic, but more the kind of thing where, if it explodes for an afternoon, everyone can just deal with it. I absolutely do not want to be The Guy who lost important data through an oversight during an upgrade or just plain bad luck.

    But, on the other hand, the SLA on my Plex server is ‘if it works, cool; if not, I’ll fix it when I can’, and that’s been wildly popular: I haven’t had any real issues, because my friends and family aren’t utter dicks about it or overly entitled, but YMMV.

    TL;DR: self-hosting for others is fine, as long as the other people understand that it’s not always going to be incredibly reliable, and you don’t ever present something that puts them at risk of catastrophic loss, unless you’ve got actual experience in providing those services, can do proper backups and HA, and are willing to sacrifice your Friday evening for no money.
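
    If anyone wants a starting point for the “proper backups” part, here’s a minimal sketch that just shells out to restic. The repo path, data directory, and retention numbers are made up for illustration; use a --password-file rather than an inline password for anything real.

    import os
    import subprocess

    REPO = "/mnt/backup/restic-repo"   # illustrative repo location
    DATA = "/srv/selfhosted"           # illustrative data directory to protect

    # restic reads the repo password from the environment; a --password-file is
    # the saner option outside of a quick sketch.
    env = dict(os.environ, RESTIC_PASSWORD="change-me")

    # Snapshot the data directory.
    subprocess.run(["restic", "-r", REPO, "backup", DATA], env=env, check=True)

    # Keep a week of dailies and a month of weeklies, drop the rest.
    subprocess.run(
        ["restic", "-r", REPO, "forget",
         "--keep-daily", "7", "--keep-weekly", "4", "--prune"],
        env=env, check=True,
    )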


  • The closest thing you’re likely to get is a black and white Brother laser.

    It’s as open as a printer is likely to ever be in terms of driver support, the availability of parts is reasonable, and you plug the thing in via USB and then forget it exists until you need to print something.

    I’ve had a 2300D for most of a decade now, and the only thing I’ve had to do is put paper in it.









  • I think the top 3 reasons are, ultimately, the same reason: the people who are already there don’t want you there, and they like the obscurity of discovery, the obfuscation of communication, the confusion around instances for onboarding, and the ability to gatekeep exactly how you’re allowed to use the platform.

    There’s issues with the underlying platform, for sure, but the established user base likes it the way it is, and is very strongly invested in preventing change.

    And, that’s okay! If you have a platform that you enjoy using, it should be defended, and aggressively.

    But, at the same time, you shouldn’t be utterly confused about why so many people either don’t want to join or bounce right off your platform and don’t stick around, when it’s pretty obvious (and has been for a while) that the culture is the big driver of it.