don’t forget the additional SSL cert for the second domain (assuming it’s not a subdomain covered by a wildcard cert)
The problem is less how often I have to charge my mouse and more that I’m almost always using it, and on top of that I forget to charge it overnight, so there’s no “convenient” time during the day when I’m not using it.
Given how often my mouse has to charge while I’m using it, I’d have a major issue with that.
uuuuuuuuuuuuu
That certainly didn’t help either
Seems like a decision for the better; as long as they have the name Mine attached to it, there will be higher expectations and lower initial trust than with an entirely different name.
If water makes other things wet, then most water is wet because it’s (usually) surrounded by more water. QED
Last I checked (which was a while ago), “AI” still can’t pass the most basic of tasks such as “show me a blank image”/“show me a pure white image”. The LLM will output the most intense fever dream possible but never a simple rectangle filled with #fff pixels. I’m willing to debate the potential of AI again once they manage that without those “benchmarks” getting special attention in the training data.
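For contrast, here’s how trivial that task is with plain code. A minimal sketch that writes a pure-white image in the text-based PPM (P3) format — the format choice and dimensions are my own, picked only to avoid any image-library dependency:

```python
# A pure-white image without any ML: every pixel is 255 255 255 (#ffffff)
# in the plain-text PPM "P3" format, which most image viewers can open.
def white_ppm(width: int = 64, height: int = 64) -> str:
    header = f"P3\n{width} {height}\n255\n"          # magic, size, max value
    row = ("255 255 255 " * width).strip()           # one all-white scanline
    body = "\n".join(row for _ in range(height))
    return header + body + "\n"

with open("white.ppm", "w") as f:
    f.write(white_ppm())
```

Opening `white.ppm` in any viewer that understands PPM shows exactly the solid white rectangle the comment describes.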
In my experience that usually boils down to the Linux version getting an (understandably) low priority compared to the Windows version, which is something that won’t change until Linux has a significant market share, and I can’t fault the devs for it. For an example of a game where that is not the case, take Factorio: the devs went out of their way to properly work on the Linux-native version and even included Linux/Mac-exclusive features in their next update. But that only happened because one of the devs themselves uses Linux.
You apparently have little interaction with regular users, because one of the top problems a non-power user has is “oops, I accidentally hit delete on this important file I don’t have a backup of”.
Not saying it makes a ton of sense for qbittorrent-nox of all things to switch, but at least for desktop applications there is a very good reason why deleting things is a two-step process.
Excuse me, they want how much for what boils down to a graphics upgrade?! Damn, that is insanely greedy and definitely won’t affect sales negatively.
Really screws with you when you miss multiple 70+% rolls in a row
I’d argue that in the states where you can lick those, you really shouldn’t, because it will instantly freeze your tongue off.
A) Funny how that works with Steam every time: “We don’t need Steam” > sales plummet > “Release on Steam in 60 days”
B) I don’t think releasing the game on Steam will save them here; from what I’ve seen it’s just a bad game, plain and simple. It will maybe close the gap a bit, but probably not by enough to actually reach the sales numbers they would like to see.
I somewhat disagree that you have to be a data hoarder for 10G to be worth it. For example, I’ve got a headless Steam client on my server that has my larger games installed (~2 TB all in all, so not data-hoarder territory), which lets me install and update those games at ~8 Gbit/s. That in turn lets me run a leaner desktop PC, since I can just uninstall the larger games as soon as I’m not playing them daily anymore, and it saves me time when Steam inevitably fails to auto-update a game on my desktop before I want to play it.
Arguably a niche use case, but it exists alongside other such niche use cases. So if someone comes into this community and asks how best to implement 10G networking, I will assume they have (or at least think they have) such a use case on their hands and want to improve that situation a bit.
Personally, going 10G on my networking gear has significantly improved my experience with self-hosting, especially when it comes to file transfers. 1G can just be extremely slow when you’re dealing with large amounts of data, so I also don’t really understand why people recommend against 10G here of all places.
Yeah, they definitely could have been quicker with the patches, but as long as the patches come out before the articles they are above average in how they handle CVEs; way too many companies out there just don’t give a shit whatsoever.
If I buy a switch and that thing decides to give me downtime in order to auto-update, I can tell you what lands on my blacklist. Auto-updates absolutely increase security, but there are certain use cases where they are more of a hindrance than a feature. Want proof? Not even Cisco does auto-updates by default (and from what I’ve managed to find in this short time, neither does TrendNet, which you’ve been speaking well of). The device deciding on its own to just fuck off and pull down your network is not in any way a feature their customers would want. If you don’t want the (slight) maintenance load that comes with an active switch, don’t get one; get a passive one instead.
So first of all, I see no point in sharing multiple articles that contain the same copy-pasted info; one of them would have been enough. That aside: again, the patches were made available before the vulnerability was published, and MikroTik not pushing updates automatically is arguably more of a feature, since automatic updates cause network downtime via a reboot, which would be somewhat problematic for networking equipment. Could they have handled it better? Yes, you can almost always handle vulnerabilities better, but their handling was not so egregious as to warrant completely avoiding them in the future.
Always nice to see a little insight into how Holo operates. A bit surprised that Yagoo sees potential for improvement on the EN manager side, but then again they have to stretch the managers there across pretty much every timezone, so I guess it makes sense.