Apple pulls its last 21.5-inch iMac from its lineup

Scaling up the memory like this is a real challenge, and I'm curious how Apple will handle it.

So far, all the products use low-power memory: LPDDR4X for the M1 and LPDDR5 for the M1 Pro and Max. These chips have 2, 4, and 8 memory channels of 64-bit width, respectively. They currently use 4 and 8 GB packages, which gives the M1 8 or 16 GB, the M1 Pro 16 or 32 GB, and the M1 Max 32 or 64 GB of memory.
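As a quick sanity check of those figures, here is a minimal sketch in Swift that multiplies channel count by package density; the assumption of one LPDDR package per 64-bit channel is mine, not something Apple documents.

```swift
// Back-of-the-envelope capacity check: one LPDDR package per 64-bit channel
// (my assumption), each package holding 4 or 8 GB as described above.
let channelCounts = [("M1", 2), ("M1 Pro", 4), ("M1 Max", 8)]
let packageSizesGB = [4, 8]

for (chip, channels) in channelCounts {
    let capacities = packageSizesGB.map { $0 * channels }
    print("\(chip): \(channels) x 64-bit channels -> \(capacities) GB")
}
// M1: 2 x 64-bit channels -> [8, 16] GB
// M1 Pro: 4 x 64-bit channels -> [16, 32] GB
// M1 Max: 8 x 64-bit channels -> [32, 64] GB
```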

Before we look at other types of memory, there are still a few possibilities with LPDDR5 itself. A look at Micron's catalog, for example, shows that dies with a density of 96 Gbit (i.e. 12 GB) are already in production, and there is also a chip with a density of 128 Gbit (i.e. 16 GB).

With those, Apple can increase the memory capacity of its existing chips by 1.5x to 2x, which would give the M1 Max 96 or 128 GB of RAM. The current 27-inch iMac is available with up to 128 GB of RAM, so that might be enough to replace it entirely.
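A minimal sketch of that scaling, assuming the M1 Max keeps eight packages (one per 64-bit channel, as above) and only the per-package density changes:

```swift
// Scale the M1 Max from today's 8 GB packages to the denser LPDDR5 dies
// mentioned above: 96 Gbit (12 GB) and 128 Gbit (16 GB) per package.
let m1MaxPackages = 8                 // one per 64-bit channel (assumption)
let densitiesGB = [8, 12, 16]

for density in densitiesGB {
    print("\(density) GB packages -> \(m1MaxPackages * density) GB total")
}
// 8 GB packages -> 64 GB total
// 12 GB packages -> 96 GB total  (1.5x)
// 16 GB packages -> 128 GB total (2x)
```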

It also appears that, at least for the Mac Pro, multiple chips will be used in a chiplet(-like) configuration. That will probably also apply to the highest configurations of the iMac (Pro). Until earlier this year, the iMac Pro was available with up to 256 GB of memory, which could be reached with two M1 Max chips.

With, say, four M1 Max chips in the Mac Pro, you get up to 512 GB of LPDDR5 memory. That would be enough to replace the majority of Mac Pro configurations (certainly by volume). In terms of power and heat dissipation it would easily fit in a small enclosure, opening the door to a Mac Pro mini (or Mac mini Pro?).
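Sketching that multi-die scaling, with the 128 GB per M1 Max from the previous paragraph as the starting point:

```swift
// Chiplet-style scaling: total memory is the per-die maximum times the
// number of M1 Max dies in the package (speculative, as described above).
let maxPerDieGB = 128                 // M1 Max with 16 GB LPDDR5 packages
for dies in [1, 2, 4] {
    print("\(dies) x M1 Max -> \(dies * maxPerDieGB) GB")
}
// 1 x M1 Max -> 128 GB
// 2 x M1 Max -> 256 GB   (the old iMac Pro maximum)
// 4 x M1 Max -> 512 GB   (the speculated Mac Pro configuration)
```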

To replace the Mac Pro completely, different memory is needed. Regular DDR5 would make a lot of sense, since it easily allows 2 or 4 TB, but I see Apple moving towards another option: HBM2e. We know this stacked memory from accelerators such as the Nvidia A100, and it offers an enormous amount of bandwidth in a very small footprint. The 80 GB version of the A100 achieves between 1935 GB/s and 2039 GB/s of bandwidth with five HBM2e stacks, roughly five times the 400 GB/s of the M1 Max.

Now I don't see Apple putting 96 stacks of 16 GB HBM2e on a package any time soon just to offer 1.5 TB of memory, but I do think there is a chance that 16 to 32 stacks become the high-end option, giving 256 or 512 GB of memory with a bandwidth between 6.4 TB/s and 12.8 TB/s. That could then be supplemented with “regular” DDR5 memory.
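The arithmetic behind those numbers, as a sketch: the per-stack bandwidth is derived from the A100 figures above (the 6.4 to 12.8 TB/s range in the text corresponds to rounding down to 400 GB/s per stack), and the 16 GB stack size is the one assumed in the paragraph.

```swift
import Foundation

// HBM2e projection: per-stack bandwidth derived from the A100 figures
// (~2000 GB/s over five stacks), scaled to the speculated stack counts.
let perStackGBs = 2039.0 / 5          // ~408 GB/s per HBM2e stack
let stackSizeGB = 16

for stacks in [16, 32, 96] {
    let capacityGB = stacks * stackSizeGB
    let bandwidthTBs = Double(stacks) * perStackGBs / 1000
    print("\(stacks) stacks: \(capacityGB) GB, ~\(String(format: "%.1f", bandwidthTBs)) TB/s")
}
// 16 stacks: 256 GB, ~6.5 TB/s
// 32 stacks: 512 GB, ~13.0 TB/s
// 96 stacks: 1536 GB, ~39.1 TB/s
```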

HBM3 is also an option in the future, with stacks of up to 24 GB and much more bandwidth per pin, up to 6.4 Gbit/s.
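For comparison, a per-stack figure for HBM3 can be sketched from that per-pin rate, assuming the interface stays 1024 bits wide per stack (as with HBM2e); that assumption is mine.

```swift
// HBM3 back-of-the-envelope: per-pin data rate times an assumed 1024-bit
// interface per stack, converted from Gbit/s to GB/s.
let pinRateGbitPerS = 6.4
let interfaceWidthBits = 1024.0
let hbm3StackGBs = pinRateGbitPerS * interfaceWidthBits / 8
print("HBM3 stack: ~\(hbm3StackGBs) GB/s")   // ~819.2 GB/s per stack
```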

Of course those sound like ridiculous numbers, but keep in mind that this memory is shared by the CPU and all the video, graphics, and AI accelerators. How much of the available memory, bandwidth, and latency each set of cores gets depends largely on which architecture Apple chooses. In any case, Apple appears to be investing heavily in memory shared by all the compute cores in the Apple Silicon chips we have seen so far, so it also has the freedom to tailor this whole architecture to its own workloads, without being restricted by existing protocols or interfaces.
