• 0 Posts
  • 17 Comments
Joined 1 year ago
Cake day: June 10th, 2023

  • CDPR is a publicly traded company with a clearly documented shareholder structure. A collective known as “Other” owns about 65% of the company’s shares, and CDPR upper management have a fiduciary duty to those shareholders. That duty was honored when they decided to release early. They knew the hype train was so intense that whatever they released would sell like icy lemonade in the Sahara. It’s not like they didn’t have access to Sony devkits and shit; they knew the performance was suboptimal. They’re not dumb. They were just OK with temporary backlash that would eventually be smoothed over by a successful anime, some patches and DLC.

    Now on to actors. Actors, whether A-list or Z-list, work for a flat fee, plus maybe royalties if they got really, really lucky. Once they have completed their performance, their end of the contract is done. They get paid and that’s that. They just wait for royalties to pay out (if they have them).

    Having said that, the idea that Keanu’s agents hold any post-payment sway compared to the collective that owns literally more than 65% of the company is a bit silly. This is why you’re getting a little bit of backlash for what you’ve written, especially since you did not preface your original comment with “Hey, this is a theory, a game theory”.


  • Precisely what I’m talking about. They can afford to do so, since they lost the trust of their users about two CEO statements ago.

    And not to go too deep into it, but how the hell are you going to create a brand new pricing scheme in only “a couple of days”, without already having a draft of it ready? Don’t you wanna check in with your lawyer? Your CFO? This shit must take more than 2 days to do.


  • TsarVul@lemmy.world to Technology@lemmy.world · Unity apologises. · 1 year ago

    We apologize for the confusion and angst the runtime fee policy we announced on Tuesday caused. We are listening, talking to our team members, community, customers, and partners, and will be making changes to the policy. We will share an update in a couple of days. Thank you for your honest and critical feedback.

    Allow me to translate:

    We’re now publishing the terms we were actually going for from the very beginning. We’ve always known that the flaming bag of shit we laid on your doorstep was unreasonable. If it worked, great; since it didn’t, it can stand in contrast to the new, less-shit terms that you can either agree to or go rewrite your whole game in a different engine. Not like our PR was great before this gambit. What have we to lose?


  • Imagine if you were the owner of a really large computer with CSAM in it. And there is in fact no good way to prevent creeps from putting more into it. And when police come to have a look at your CSAM, you are liable for legal bullshit. Now imagine you had dependents. You would also be well past the point of being respectful.

    On that note, the captain db0 has raised an issue on the GitHub repository of LemmyNet, requesting essentially the ability to add middleware that checks the nature of uploaded images (issue #3920, if anyone wants to check; a rough sketch of the idea is below). Point being, the ball is squarely in their court now.
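
    For anyone wondering what that middleware could look like, here is a rough sketch of the shape of it. Lemmy itself is written in Rust and issue #3920 doesn’t prescribe an API, so every name below is invented for illustration; the point is only that the scan happens before anything is persisted.

```python
from dataclasses import dataclass

@dataclass
class ScanResult:
    allowed: bool      # True if the image may be stored
    reason: str = ""   # human-readable reason when rejected

def scan_image(image_bytes: bytes) -> ScanResult:
    # Placeholder for whatever backend an instance operator plugs in:
    # a perceptual-hash blocklist, an external classifier, a vendor API, etc.
    return ScanResult(allowed=True)

def handle_upload(image_bytes: bytes) -> bytes:
    # The point of the proposed hook: scan BEFORE anything is persisted,
    # so a rejected image never touches the instance's storage at all.
    result = scan_image(image_bytes)
    if not result.allowed:
        raise PermissionError(f"upload rejected: {result.reason}")
    return image_bytes  # hand off to the real storage layer from here
```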


  • Traditional hashes like MD5 and SHA-256 are not locality-sensitive, so they can’t be used to detect a match to within some degree of similarity. Otherwise, yes, you are correct: perceptual hashes can create false positives. Very unlikely, but possible. This is not a problem with a perfect solution; extraordinary edge cases must be resolved on a case-by-case basis.
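
    To make the locality point concrete, here is a minimal sketch using Python’s hashlib plus the third-party Pillow and ImageHash packages. The file name, the one-pixel resize, and the distance threshold are all just illustrative assumptions:

```python
import hashlib

from PIL import Image   # pip install Pillow
import imagehash        # pip install ImageHash

original = Image.open("photo.png")  # hypothetical input image
# A trivial alteration: shave one pixel off the width. Visually identical,
# but the raw bytes change.
altered = original.resize((original.width - 1, original.height))

# Cryptographic hashes: any change at all yields a completely different
# digest, so they only catch byte-for-byte identical files.
print(hashlib.sha256(original.tobytes()).hexdigest())
print(hashlib.sha256(altered.tobytes()).hexdigest())

# Perceptual hashes: visually similar images map to nearby hashes, and the
# subtraction operator returns the Hamming distance between them.
h1 = imagehash.phash(original)
h2 = imagehash.phash(altered)
print(h1 - h2)        # small distance, i.e. "probably the same picture"
print(h1 - h2 <= 8)   # example threshold; tune against your false-positive budget
```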

    And yes, the simplest solutions must always be implemented first: tracking post reputation, a captcha before posting, waiting for an account to mature before it can post, and so on. The problem is that right now the only defense we have access to is mods. Mods are people, usually with eyeballs. Eyeballs that will be poisoned by CSAM so the rest of us can post memes and funnies without issue. That is not fair to them. We must do all we can, and if all we can do includes perceptual hashing, we have a moral obligation to do it.
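
    As a sketch of that “simplest solutions first” idea, a pre-post gate could be as small as the following; the thresholds and field names are invented for illustration, not anything Lemmy actually ships:

```python
from datetime import datetime, timedelta, timezone

# Illustrative thresholds only.
MIN_ACCOUNT_AGE = timedelta(days=7)
MIN_REPUTATION = 10

def may_post_images(created_at: datetime, reputation: int) -> bool:
    # created_at is assumed to be timezone-aware (UTC).
    age = datetime.now(timezone.utc) - created_at
    return age >= MIN_ACCOUNT_AGE and reputation >= MIN_REPUTATION
```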