Gear300
I ask this with more of a software background than an engineering background, but here I go anyway.
Big data is arguably the cultural motif of the information age. The trend is toward immersing ourselves in media of various sorts and processing them at exceptional rates. Of course, the information can be poisoned with false information, propaganda, virulent memes, and other noise, but at the very least, none of this is ahistorical. Other trends include the fourth paradigm in science (data-intensive science); business consolidation and migration to the cloud; streaming data, i.e. data that lives in the network and not just at the endpoints; and so on.
So the question here has to do with the growing volume and complexity of data. As it grows, it strains our architectures and algorithms. Classical computing's main answer is parallelism, scaled both horizontally and vertically, e.g. microservices in industry solutions. Big cloud vendors run on networks with bisection bandwidths on the order of petabytes per second, and they do what they can to optimize data storage and processing. But is there a credible futurist or tech-economist who expects a saturation point in the growing demand for data? Sort of like a forecast that prefigures a slowdown of classical computing, as some anticipated in the early information era?
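As a rough sanity check on the "petabytes per second" figure, here is a minimal back-of-the-envelope sketch. It assumes a standard k-ary fat-tree data-center topology with 64-port switches and 100 Gb/s links; those numbers are illustrative assumptions, not figures from any particular cloud vendor.

```python
# Rough estimate of full bisection bandwidth for a k-ary fat-tree
# data-center network. Port count and link rate are assumed values,
# chosen only to show the order of magnitude.

def fat_tree_bisection_bw(k_ports: int, link_gbps: float) -> float:
    """Return full bisection bandwidth in bytes/second for a k-ary fat tree."""
    hosts = k_ports ** 3 // 4                       # total hosts in a k-ary fat tree
    bisection_bits = (hosts / 2) * link_gbps * 1e9  # non-blocking: half the hosts send
                                                    # across the bisection at line rate
    return bisection_bits / 8                       # bits/s -> bytes/s

if __name__ == "__main__":
    bw = fat_tree_bisection_bw(k_ports=64, link_gbps=100)  # assumed switch radix and link rate
    print(f"~{bw / 1e15:.2f} PB/s of bisection bandwidth")  # roughly 0.4 PB/s
```

Even a single fabric of that (assumed) size lands in the fraction-of-a-petabyte-per-second range, so petabyte-scale aggregate bisection bandwidth across a large vendor's fleet is plausible.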