• .:\dGh/:.@lemmy.ml · +23/−2 · 1 year ago

    Well, that’s a bummer, but it will be interesting to see how it stacks up on day-to-day usage.

    It’s not that folks on the base M3 are going to stress the machine with heavy computational tasks, but the Pro and Max will surely have enough people comparing synthetic benchmarks against real-world ones to see which of Apple’s optimizations are and are not paying off.

  • M500@lemmy.ml · +16/−1 · 1 year ago

    I’m pretty close to getting a used m1 air for $500.

    I can probably search a bit and get a slightly better deal.

    The price might be a bit high, but I’m not in the US and we have higher prices here.

      • TagMeInSkipIGotThis@lemmy.nz · +4 · 1 year ago

        Yeah, me too; I bought it to replace a 2013 MBP. It’s so light, the battery life is ridiculous, and it’s far gruntier than I need for the work I do, which is mostly in a shell / nvim etc. anyway.

        • DJDarren@thelemmy.club · +1 · 1 year ago

          I have a 15" M2 Air, and honestly, this laptop will last me longer than Apple will want it to. An absolutely astonishing bit of engineering.

          • TagMeInSkipIGotThis@lemmy.nz · +1 · 1 year ago

            Heh, well yes, I’m sure they would rather I hadn’t hung on to my last one for 10 years. In fact, it’s still going too: like I’d done with the last two MacBooks I’d owned, it went as a hand-me-down to my father, who just uses it for email & web browsing. I’m hoping the Air will be around for a similar amount of time; it will probably come down to battery & flash degradation over time, I suspect.

        • B0rax@feddit.de · +1 · 1 year ago

          Well, the comparison to a now 10-year-old machine is also not quite apples to apples.

          • TagMeInSkipIGotThis@lemmy.nz · +1 · 1 year ago

            Ah well, that was just me replacing my personal laptop, so the 10-year-old machine had been outperforming my 3 different work laptops (typically Lenovos, running Windows, refreshed every couple of years) all the way up until I got the Air.

    • w3dd1e@lemm.ee · +3 · 1 year ago

      I just got one for around $600 in the US on Swappa. I tried to get one cheaper but couldn’t find it where I live. Anyway, I’m super happy with it. I made sure it had a low number of battery cycles, and it’s in near-mint condition.

      The other day, I was coding in VSStudio, debugging JavaScript in Chrome with multiple tabs open, and logging issues I found on a template in Excel. Excel alone makes my work computer freeze, and I didn’t notice a single slowdown on this thing. It was fantastic.

      I don’t love the way Mac handles open-window management but aside from that I’m very happy.

      • M500@lemmy.ml · +1 · 1 year ago

        Do you have 8GB of RAM in your machine?

        There is an electronics market where I live. I have a recent-ish Lenovo; it might actually be a year newer than the M1, so I am going to try to swap it. Maybe I can go next week.

        • w3dd1e@lemm.ee · +1 · 1 year ago

          Yeah, just 8. I was actually worried about only having 8, but I couldn’t bring myself to spend the extra money on the 16GB (I have a desktop to fall back on if I need it).

          So far, so good. I haven’t noticed hitting a wall with the low amount of RAM. I forgot to mention, I’m just coding websites. Even with the JavaScript, I’m not building AAA games or doing a ton, really.

  • Perfide@reddthat.com · +11 · 1 year ago

    What the fuck is with this trend of releasing a great product and then 2-3 generations later nerfing the shit out of the memory bandwidth? Nvidia, Intel, I think AMD, and now Apple are all guilty of this recently.

    • AwesomeSteve@lemmy.world · +9 · 1 year ago · edited

      This is why competition is always a must. When Apple released the M1 series, the entire tech industry was basically shitting on Intel and spelling the death of x86. Make no mistake, Arm-based chips are years ahead in efficiency and performance per watt. Qualcomm’s Snapdragon X Elite and Microsoft have already signed an exclusive deal to sell Arm-based laptops running Windows, coming in 2024. Nvidia and AMD have also announced Arm-based PC chips coming to market in 2025.

      On the GPU front, Nvidia has basically abandoned the former customers (gamers) who made it a trillion-dollar company. Jensen Huang now focuses on the server and AI chip market, where chips sell for many times the MSRP of a gamer-tier RTX GPU. Just look at the RTX 4000 series pricing and the VRAM on entry and mid-tier cards. Intel Arc and AMD Radeon are decades behind Nvidia in terms of software APIs; the CUDA ecosystem is what allowed Nvidia to basically monopolize the AI field and milk its customers. Gamers are no longer needed by Nvidia; it will continue to release subpar GPUs that barely keep up with the expected generational leap in CUDA cores, VRAM, and memory bandwidth, some even downgraded, ffs.

    • dinckel@lemmy.world · +5/−5 · 1 year ago

      This is no mistake. They purposefully cut a ton of corners with the M1 to make it stand out in exceptional ways. Thankfully for them, everything aligned perfectly in the timeline they expected; however, cutting corners won’t cut it anymore. Now that the dust has settled, they need to keep up with the R&D and push forward again. It’s no accident that they’re comparing M3 performance to the M1 series.

      • Moneo@lemmy.world · +13/−1 · 1 year ago

        I don’t understand your comment at all. How did they cut corners? Why is it no accident that they’re comparing the M3 to the M1?

  • TBi@lemmy.world · +14/−3 · 1 year ago

    I’m really interested to see it go up against the new Snapdragon X Elite chip. Hopefully some competition at last!

    • Sowhatever@discuss.tchncs.de · +5/−2 · 1 year ago

      I’d love a system with the efficiency of ARM and real out-of-the-box support for Linux. Like a state-of-the-art Raspberry Pi.

      • Synapse@lemmy.world · +6 · 1 year ago

        I would love to see a state-of-the-art RISC-V laptop, but we are not quite there yet. A good power efficient Linux laptop running on ARM would be pretty cool in the meantime.

        • Sowhatever@discuss.tchncs.de · +2 · 1 year ago

          I’m actually surprised how fast RISC-V SBCs have caught up with ARM-based ones, but a laptop needs a lot more polish and mass production to be worth it.

          As for ARM laptops, I’m afraid they will be Windows-only: secure boot or whatever, no GPU drivers, maybe even no wifi on Linux.

          I have an M2 MacBook from work, and it’s the closest thing one can get. Really impressive performance and efficiency, and the OS is acceptable once you get used to it.

          • Open_Mike@artemis.camp · +2 · 1 year ago

            I had some fun today trying to install Windows 11 in a virtual machine on an M2 Mac. I couldn’t use VirtualBox, as their website was on holiday (bad gateway), so I tried UTM. I set it to emulate x86 and gave it a Windows 11 ISO. About half an hour later it got to asking about language etc., and eventually crashed with an oobekeyboard error. Incredibly slow.

            I tried again with an ARM Windows 11 download and native UTM, and it worked OK. Not as fast as I’d expect, but I only gave it 4GB of RAM.

          • 2xsaiko@discuss.tchncs.de · +1 · 1 year ago

            As for ARM laptops, I’m afraid they will be windows only, secure boot or whatever, no GPU drivers, maybe even no wifi on Linux.

            Considering that Asahi Linux exists and makes Linux possible on M-series MacBooks, even though Apple is notorious for locking down its stuff and giving no hardware documentation whatsoever, I wouldn’t say that outright. Secure Boot should allow you to enroll your own keys, for example…

            Although it would also be extremely funny for the only mainline ARM laptops that can run Linux to be MacBooks.

            • Sowhatever@discuss.tchncs.de · +1 · 1 year ago

              Yes, but Asahi still doesn’t support everything on the laptop, like the microphones or speakers, and last time I checked the power consumption was also not as good as on macOS.

    • AwesomeSteve@lemmy.world · +2 · 1 year ago · edited

      The only way to shame Apple is for laptop OEMs to ship their entry-level laptops and mini desktops with the Snapdragon X Elite chip, a baseline 16GB of RAM, $50 to upgrade to 32GB, 512GB of SSD, and reasonable upgrade pricing that doesn’t rob customers. If they really want to retain and recapture market share from Dracula Tim Cook before bleeding out like Nokia did, this is the step to take before even more people get hooked on the Apple ecosystem; once they’re in, it’s hard to get out from the data and services they’ve signed into. I’m curious whether a class-action suit could be brought against Apple over the memory and SSD pricing debacles.

  • bbbbb@lemmy.world · +8/−1 · 1 year ago

    This was a real bummer for anyone interested in running local LLMs. Memory bandwidth is the limiting factor for inference performance, and the Mac unified memory architecture is one of the relatively cheaper ways to get a lot of memory, rather than buying a specialist AI GPU for $5-10k. I was planning to upgrade the memory a bit further than normal on my next MBP upgrade in order to experiment with AI, but now I’m questioning whether the Pro chip will be fast enough to be useful.
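    A rough way to see why bandwidth is the ceiling: token-by-token LLM decoding has to stream essentially all model weights from memory for each generated token, so decode speed is capped at roughly bandwidth divided by model size. A back-of-the-envelope sketch (illustrative model sizes, not benchmarks):

```python
def est_decode_tokens_per_s(bandwidth_gb_s: float,
                            params_billions: float,
                            bytes_per_param: float) -> float:
    """Upper-bound decode rate: each token reads ~all weights once."""
    model_gb = params_billions * bytes_per_param
    return bandwidth_gb_s / model_gb

# A 7B model quantized to 4 bits (~0.5 bytes/param) is ~3.5 GB of weights.
m3_pro_cap = est_decode_tokens_per_s(150, 7, 0.5)  # ~43 tokens/s ceiling
m3_max_cap = est_decode_tokens_per_s(400, 7, 0.5)  # ~114 tokens/s ceiling
print(round(m3_pro_cap), round(m3_max_cap))
```

    Real throughput lands below this ceiling (compute, caches, and framework overhead all take a cut), but it tracks why the bandwidth spec matters so much for local inference.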

  • mingistech@lemmy.world · +6 · 1 year ago

    The M3 Pro has 150GB/s bandwidth vs 200GB/s for the M2 Pro. I think that can be explained by using 3 RAM modules (6GB or 12GB each) vs 4 on the M2.

    The M3 Max is listed as “up to” 400GB/s, where the M2 Max doesn’t have that qualifier. The 14-core, I think, always uses 3 of the wider 24GB/32GB modules for 300GB/s; the 16-core always uses 4 for 400GB/s.
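    The module-count theory lines up with the raw numbers if you assume LPDDR5-6400 and one 64-bit channel per Pro-class module (my assumption, not an Apple spec): peak bandwidth is just bus width times transfer rate.

```python
def lpddr5_bandwidth_gb_s(channels_64bit: int, mt_s: int = 6400) -> float:
    """Peak bandwidth = (bus width in bytes) x (megatransfers per second)."""
    bus_bytes = channels_64bit * 64 // 8
    return bus_bytes * mt_s / 1000  # GB/s

print(lpddr5_bandwidth_gb_s(3))  # 3 channels  -> 153.6, the "150GB/s" M3 Pro
print(lpddr5_bandwidth_gb_s(4))  # 4 channels  -> 204.8, the "200GB/s" M2 Pro
print(lpddr5_bandwidth_gb_s(8))  # 4 wider (2-channel) modules -> 409.6, "400GB/s" Max
```

    The marketing figures look like these theoretical peaks rounded down, which is consistent with the M3 Pro simply dropping one memory channel.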

  • psycho_driver@lemmy.world · +2/−2 · 1 year ago

    No more Jim Keller architecture design. The same thing will probably happen to AMD when they need to move on from Zen. Bulldozer 2.0.

  • Nogami@lemmy.world · +1/−2 · 1 year ago

    Doubt it will make a difference that anyone except benchmarkers will notice.

      • Viking_Hippie@lemmy.world · +1 · 1 year ago

        That’s my point: it costs more but has less memory bandwidth, which people here seem to consider a GOOD thing, or at least that’s what they seem to be trying to convince themselves and others of.

        • 4am@lemm.ee · +3/−1 · 1 year ago

          It can be more complicated than “bigger number better”. I don’t think anyone’s trying to justify it; people are probably just speculating on why it is the way it is.

          Maybe Apple discovered that most software’s bottleneck isn’t RAM access for user-land operations but cache misses, and they sacrificed some of the circuitry supporting memory access speed for additional on-die memory? So while you have less RAM bus speed, it doesn’t actually matter, because you never could’ve used it anyway?

          I don’t know the real-world numbers for any of this; I’m spitballin’ here. But that’s an example of an optimization that could plausibly happen when you’re working with hardware design.
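          That trade-off is what the classic average memory access time (AMAT) formula captures: cutting the miss rate with more on-die memory can help more than speeding up the path to DRAM. A sketch with made-up numbers (nothing here is a measured Apple figure):

```python
def amat_ns(hit_ns: float, miss_rate: float, miss_penalty_ns: float) -> float:
    """Average memory access time: cache hit cost plus expected miss cost."""
    return hit_ns + miss_rate * miss_penalty_ns

baseline     = amat_ns(1.0, 0.05, 100.0)  # 5% misses, 100 ns to DRAM -> 6.0 ns
bigger_cache = amat_ns(1.0, 0.03, 100.0)  # fewer misses              -> 4.0 ns
faster_dram  = amat_ns(1.0, 0.05, 80.0)   # 20% faster DRAM path      -> 5.0 ns
print(baseline, bigger_cache, faster_dram)
```

          With those (hypothetical) numbers, shaving two points off the miss rate beats a 20% faster memory path; if real workloads sit mostly in cache, a narrower bus costs less than the spec-sheet delta suggests.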

          People have been talking shit about Apple since the early 90s, but their stuff still works and they’re still selling it so, miss me with that “no no THIS time they’re playing us all for fools! No, seriously, guys! Guys? STOP HAVING FUN!” nonsense.

          I’ll believe it when the benchmarks come out.