What is HP's memory-driven computing?

AI Thread Summary
Memory-driven computing is a new architecture introduced by HP that makes memory the central resource for speeding up computation. The core idea discussed in the thread is a time-memory tradeoff: giving a program more memory can significantly cut its running time. For instance, a program requiring 1 GB of memory might take an hour to complete, while a version using 4 GB could finish in half that time. The relationship between extra memory and time saved is not linear; it depends on the specific program and what it computes, and computational complexity theory suggests there are limits to how far memory can be traded for execution time, so the approach does not apply equally to every program. The thread also touches on running Linux as on-chip firmware and using optical-fiber buses to boost performance further.
ShayanJ
I just read about HP's new computing technology, memory-driven computing, but I can't figure out the idea behind it. Can anyone explain it?

Thanks

https://www.hpe.com/us/en/newsroom/news-archive/press-release/2016/11/1287610-hewlett-packard-enterprise-demonstrates-worlds-first-memory-driven-computing-architecture.html

http://www.preposterousuniverse.com/blog/2016/12/20/memory-driven-computing-and-the-machine/
 
If it is what I think it is, the idea is that you trade memory for time: what you spend in memory you get back as a shorter running time. In simple words, if a program runs using 1 GB of memory and takes 1 hour to complete, you can write a different program that computes the same thing using, say, 4 GB of memory but finishes in 0.5 hour. The tradeoff is not directly proportional; it depends on the program and what it computes, and you might need 10x or even 100x the memory to cut the time by a factor of 2. And of course it is not guaranteed that this method works for all sorts of programs. There is a theorem in computational complexity theory about this, but right now I can't remember its name or exact formulation.
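To make the tradeoff concrete, here is a rough Python sketch of my own (a toy example of trading memory for time in general, not HP's actual architecture; the function names are made up). The same quantity is computed twice: once by recomputing every subproblem, and once by caching intermediate results in extra memory.

```python
from functools import lru_cache
import time

def count_paths_slow(n, k):
    # Recomputes every subproblem from scratch: tiny memory footprint,
    # but an exponential number of repeated recursive calls.
    if k == 0 or k == n:
        return 1
    return count_paths_slow(n - 1, k - 1) + count_paths_slow(n - 1, k)

@lru_cache(maxsize=None)  # spend memory on a cache of intermediate results
def count_paths_fast(n, k):
    # Same recurrence, but each (n, k) pair is computed only once.
    if k == 0 or k == n:
        return 1
    return count_paths_fast(n - 1, k - 1) + count_paths_fast(n - 1, k)

for f in (count_paths_slow, count_paths_fast):
    start = time.perf_counter()
    result = f(24, 12)
    print(f.__name__, result, f"{time.perf_counter() - start:.3f} s")
```

The cached version stores on the order of n·k intermediate values, so it uses more memory, and how much time that memory buys depends entirely on how much repeated work the slow version was doing, which is why the tradeoff is program-dependent.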
 
Linux as on-chip firmware with optical-fiber buses?
 