1TB RAM memory in one stick

MathematicalPhysicist
Gold Member

Main Question or Discussion Point

How far are we from a single memory stick with a capacity of 1 TB?

I mean not combining several modules, but one memory stick holding 1 TB.
My question concerns the RAM in desktops and laptops.
 

Answers and Replies

Regarding RAM sticks, I think it'll be about 3-4 years for high-end server or datacenter hardware.
Within the scope of desktop or laptop PC RAM only, maybe 8-10 years, since some demand for such monsters has to build up first.

If the question is about some kind of storage instead, then those are already available.
 
If you are looking to add or replace RAM in your PC, it's usually easiest to note your PC's specific model and then check which memory modules work with it. I've added RAM to an old Asus laptop and an old Dell Inspiron desktop in the last year, and it was easy and relatively cheap to find 4 GB and 8 GB sticks. But 1 TB?

Honestly, I didn't even look, and a cursory search finds lots of SSD drives but not much RAM and anything promising seems to be NVMe-based and requires a PCIe bus, so is probably really storage and that's possibly not what you are looking for.

You may get what you want as a service. Azure has a 1 TB RAM IaaS option (it ain't cheap, though), and I'd expect AWS and Google do as well; I've not used them as much, but the big clouds are pretty much equivalent at the base service level these days.

What are you doing that requires so much RAM?
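Before shopping for sticks, it helps to know what's already installed. A minimal sketch, assuming Linux, where /proc/meminfo reports the total in KiB (the sample text below is illustrative; a real run would read the file itself):

```python
# Parse the MemTotal line from Linux's /proc/meminfo to see how much
# RAM is currently installed. (Linux-specific; Windows users would
# check System Information or their vendor's support page instead.)

def mem_total_gib(meminfo_text: str) -> float:
    """Return total RAM in GiB from /proc/meminfo-style text."""
    for line in meminfo_text.splitlines():
        if line.startswith("MemTotal:"):
            kib = int(line.split()[1])  # value is reported in kB (KiB)
            return kib / (1024 ** 2)
    raise ValueError("MemTotal not found")

# Example with a captured snippet; a nominal "16 GB" machine reports
# slightly less because firmware reserves some memory:
sample = "MemTotal:       16322352 kB\nMemFree:         1204480 kB\n"
print(round(mem_total_gib(sample), 1))  # ~15.6 GiB
```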
 
Vanadium 50
Staff Emeritus
Science Advisor
Education Advisor
It took 8 years to get from 16 GB to 128 GB. That sets the scale.

What is your application? I don't think any CPU out there supports this much memory. More generally, why build such an unbalanced machine? (Only one processor per TB of memory)
 
MathematicalPhysicist
Gold Member
It took 8 years to get from 16 GB to 128 GB. That sets the scale.

What is your application? I don't think any CPU out there supports this much memory. More generally, why build such an unbalanced machine? (Only one processor per TB of memory)
It's like the saying that is (perhaps wrongly) attributed to Bill Gates:
https://www.computerworld.com/article/2534312/the--640k--quote-won-t-go-away----but-did-gates-really-say-it-.html
You can never really know whether you need something until it's available to you.
Perhaps this is far beyond today's technology; I have the patience to wait.

Well, I am contemplating using my computer to solve DEs numerically, which I am quite sure will strain my current 16 GB machine.

Perhaps it's good advice to add a couple of 16 GB sticks to my current machine.
I'll search for how to install them.

I appreciate everyone's input!
cheers!
 
Vanadium 50
Staff Emeritus
Science Advisor
Education Advisor
The fact that someone said something that turned out to be wrong 40 years ago is not the best reason to conclude there is a need for a product nobody seems to be buying. One could build a 768 GB single-socket computer today using 128 GB sticks, at $15,000 each. There doesn't seem to be much call for it.

In those 40 years, CPU speed has increased by a factor of about 100,000 and memory capacity by around 40,000. The ratio is more or less constant. That's what I mean by balance. You're asking about a technology that would reduce the CPU/memory ratio by an order of magnitude. I don't see a reason for that.

You might also consider what you are putting into that terabyte. If it's coming from disk, you will need of order an hour just to fill the memory. If you don't need all of it, it's more efficient to read it when you need it. If you need all of it once, it's more efficient to read it from disk. If you need some of it more than once, but you don't know ahead of time which parts you need, memory will help, but there are efficient cache algorithms that don't need to cache all of it, and faster and parallel disk systems to reduce the penalty of a cache miss. For $90,000 you can buy a lot of speed-up.
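The fill-time point is easy to sanity-check with back-of-envelope arithmetic. A minimal sketch, where the 300 MB/s and 150 MB/s figures are assumed sustained rates for a SATA-class SSD and a spinning disk respectively:

```python
# How long does it take to stream 1 TB from storage into RAM at a
# given sustained transfer rate? Confirms the "of order an hour" claim.

def fill_time_minutes(bytes_total: float, bytes_per_sec: float) -> float:
    return bytes_total / bytes_per_sec / 60.0

TB = 1e12
print(round(fill_time_minutes(TB, 300e6)))  # ~56 minutes at 300 MB/s (SSD-class)
print(round(fill_time_minutes(TB, 150e6)))  # ~111 minutes at 150 MB/s (HDD-class)
```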

If, on the other hand, you're using this memory to save the results of calculations, that argues for a higher CPU/memory ratio, not lower.
 
Well I am contemplating using my computer to solve numerically DEs which I am quite sure will slow my current 16 GB machine.
What sort of DEs? Do you know any algorithms for the class of problems you are interested in that are helped by a large amount of RAM? All the ones I know are limited by CPU speed and L1 cache.
 
What sort of DEs? Do you know any algorithms for the class of problems you are interested in that are helped by a large amount of RAM - all the ones I know are limited by CPU speed and L1 cache?
In Mathematica, for statistical analysis, I am very often using the ParallelTable command over something like 1,000,000,000,000 evaluations to find an accurate probability, or in Monte Carlo methods. This requires terabytes of DDR RAM to execute.
This is not playing games on your PC, where 16 GB is plenty!
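Worth noting: a Monte Carlo estimate does not have to materialize all its samples in RAM. A minimal sketch of a streaming estimate that uses constant memory regardless of sample count (the quarter-circle target is just an illustrative example, not the poster's actual computation):

```python
# A running-count Monte Carlo estimate needs O(1) memory no matter how
# many samples are drawn -- the samples never need to be stored in a
# table. Sketch: estimate P(x^2 + y^2 < 1) for uniform x, y, which
# equals pi/4, one sample at a time.

import random

def mc_quarter_circle(n_samples: int, seed: int = 0) -> float:
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):  # memory use: two floats and a counter
        x, y = rng.random(), rng.random()
        if x * x + y * y < 1.0:
            hits += 1
    return hits / n_samples

est = mc_quarter_circle(1_000_000)
print(4 * est)  # approximately pi
```

Storing a table of 10^12 results is what would require terabytes; accumulating the statistic as you go avoids that entirely.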
 
This requires TB of DDR RAM to execute.
If the question is not limited to a single stick of memory, then it is not really a problem: it's even doable relatively cheaply. The first generation of (single-box, entry-level) server hardware able to host that amount of RAM has already made its way to the second-hand market.

The alternative is to do it through distributed computing, or by purchasing time on a bigger computer.
 
or in Monte Carlo methods
I worked with a government regulator on their data strategy; they do large-scale Monte Carlo simulation, but all in the cloud (they use Google and AWS). It's very expensive to provision sufficient compute on-premises, but it's not cheap in the cloud either.

The takeaway for me was that you either accept your stats calcs are going to take time on slower, lower cost, equipment, or pay for the necessary compute to speed it up.
 
It took 8 years to get from 16 GB to 128 GB. That sets the scale.

What is your application? I don't think any CPU out there supports this much memory.
A 64-bit machine can address ##2^{64}## bytes, which is about 18.4 exabytes; however, the motherboard and OS generally will not support that much. IBM z/OS mainframe architecture is designed to support 4 TB per LPAR (logical partition). Although most of the larger mainframes now have multiple cores, there are some single-CPU IBM z/OS machines that can support 30 LPARs, which means 120 TB on a single CPU.

The z13 machine, which has 64 cores, can have up to 80 LPARs. Ref: https://www.redbooks.ibm.com/redbooks/pdfs/sg248251.pdf
More generally, why build such an unbalanced machine? (Only one processor per TB of memory)
Most PCs allow memory sticks to be shared by multiple processors. You can buy a quad-core or octa-core machine that has only 1 or 2 memory sticks; the memory management system can distribute access to a single stick's capacity across multiple processors.
 
There seem to be at least 3 different questions being answered here:
  1. When will 1 TB memory sticks be available in desktops or laptops?
  2. When will 1 TB memory sticks be produced?
  3. When will high end workstation/server configurations with 1 TB RAM be available?
Each of these is relevant to the original question but has very different answers which are becoming confused so I will attempt to separate and summarize the answers:

When will 1 TB memory sticks be available in desktops or laptops?
Not within the foreseeable future. The trade-off between cost (including power consumption) and performance, weighed against the demands of single-user desktop computing, means there is no demand for 1 TB of RAM in a single stick (which implies 2 TB per installation, since paired sticks are more efficient) for any current or foreseeable consumer need in desktops or, particularly given the power-consumption point, laptops.

When will 1TB memory sticks be produced?
As it took 8 years to get from 16 GB to 128 GB (8×), and we already have 256 GB sticks, 2025-2028 seems a reasonable estimate.
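The arithmetic behind that estimate can be sketched as follows (the assumption that the historical doubling cadence continues unchanged is, of course, just an extrapolation):

```python
# 16 GB -> 128 GB is three doublings in 8 years, i.e. one doubling
# every ~2.7 years. 256 GB -> 1024 GB (1 TB) is two more doublings.

import math

doubling_years = 8 / math.log2(128 / 16)      # ~2.67 years per doubling
doublings_left = math.log2(1024 / 256)        # 2.0
print(round(doubling_years * doublings_left, 1))  # ~5.3 years after 256 GB sticks ship
```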

When will high end workstation/server configurations with 1TB RAM be available?
They have been available for some time; you can rent them surprisingly cheaply, e.g. an AWS x1.16xlarge (64 cores, 976 GB) in a US data centre for less than USD 7 per hour (source - other cloud services are available). You can buy a barebones desktop workstation supporting 1.5 TB RAM (source) - not sure how much this costs, but populating it is not going to be cheap!

Edit: added 1.5 TB workstation information.
 
IMHO, such memory may appear integrated onto 'distributed processing' accessory cards, like 'external' graphics cards but rather larger. In effect, your PC becomes a 'smart terminal'...

Um, Google just found mention of GPU cards with 32 GB, even 48 GB VRAM, with umpteen RISC cores.
Given my custom CAD-Tower has 8 cores, 32 GB RAM and twin 2 GB GPU cards, I'm a bit stunned...

(Those cards were 'Last Year's Gaming Tech' and would not economically mine Bitcoin, hence affordable...)

'Cloud' or 'Networked' computing is probably more affordable, but there will be 'iron-walled' applications where spreading the data around is unwise...
 
OCR
" source " doesn't seem to go very far for me. . . . 🙄
Yes, defeated by deep linking/country of origin/some other protection I suppose. Well anyone that wants it badly enough can track it down on AWS...
 
