Armed with a powerful deep-sky-object capture rig, a set of Solidigm QLC SSDs, and a rugged Dell XR7620 server, Jordan Ranous of Storage Review recently explored the need for robust, cost-effective storage to manage the rapidly exploding data requirements of edge-based AI-accelerated scientific research.
What follows is an excerpt from his story posted at Storage Review. Please see that article for further specifications, details on the equipment used, and more on his testing methods.
In recent years, scientific and data computing has undergone a monumental shift, transitioning from traditional, centralized computing models to the more dynamic realm of edge computing. This shift is not just a change in computing preferences but a response to the evolving needs and complexities of modern data processing and exploration.
Edge computing refers to processing data near the location where it is generated, as opposed to relying on a centralized data-processing warehouse. This shift is increasingly relevant in fields where real-time data processing and decision-making are crucial. Edge computing is compelling in scientific research, especially in disciplines that require rapid data collection and analysis.
- Firstly, the sheer volume of data generated by modern scientific experiments is staggering. Traditional data processing methods, which involve transmitting massive datasets to a central server for analysis, are becoming impractical and time-consuming.
- Secondly, the need for real-time analysis is more pronounced than ever. In many research scenarios, the time taken to transfer data for processing can render it outdated, making immediate, on-site analysis essential.
- Lastly, more sophisticated data collection technologies have necessitated the development of equally sophisticated data processing capabilities. Edge computing answers this need by bringing powerful computing capabilities closer to data sources, thereby enhancing the efficiency and effectiveness of scientific research.
Scientific research, our edge computing focus for this article, places a premium on preserving as much of the raw data collected by modern, sophisticated sensors as possible. Real-time monitoring and analysis of the captured data using accelerators like the NVIDIA L4 at the edge provides summaries, but there is no replacement for capturing and preserving all data for future, deeper analysis. This is where ultra-dense Solidigm QLC SSDs come in.
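To make this "summarize in real time, keep everything" pattern concrete, here is a minimal sketch in Python. It is purely illustrative: the directory name, queue size, and dummy inference function are assumptions, not the actual Storage Review capture pipeline. The idea is that every raw sub-frame is written to the dense local SSD volume before a lightweight summary path ever sees it.

```python
import queue
import threading
from pathlib import Path

RAW_DIR = Path("raw_subframes")  # in the real rig this would point at the QLC SSD volume

def capture_loop(frame_source, frame_queue):
    """Persist every raw sub-frame to local storage, then hand it off for summarizing."""
    RAW_DIR.mkdir(parents=True, exist_ok=True)
    for i, frame in enumerate(frame_source):
        (RAW_DIR / f"sub_{i:06d}.raw").write_bytes(frame)  # full-fidelity copy kept for later
        frame_queue.put(frame)                              # lightweight path for real-time analysis
    frame_queue.put(None)                                   # signal end of session

def analyze_loop(frame_queue, run_inference):
    """Produce real-time summaries on the edge accelerator without gating the capture path."""
    while (frame := frame_queue.get()) is not None:
        print(run_inference(frame))

# Usage sketch with stand-in data and a dummy "model".
frames = (bytes(1024) for _ in range(3))
q = queue.Queue(maxsize=8)
threading.Thread(target=capture_loop, args=(frames, q), daemon=True).start()
analyze_loop(q, run_inference=lambda f: f"frame summary: {len(f)} bytes")
```

The design point is that the summary path can be as lossy or as slow as needed; the raw data on the drive is the record of truth for later analysis.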
Astrophotography, the practice of capturing images of celestial bodies and large areas of the night sky, is a prime example of a field that significantly benefits from edge computing. Traditionally, astrophotography is a discipline of patience, requiring long exposure times and significant post-processing of images to extract meaningful data. In the past, we looked at accelerating the process with a NUC cluster. Now, it's time to take it to the next level.
Preserving every sub-frame in astrophotography is vital for researchers as it unlocks a wealth of information essential for advancing astronomical knowledge. Each sub-frame can capture incremental variations and nuances in celestial phenomena, which is crucial for detailed analysis and understanding. This practice enhances image quality through noise reduction and ensures data reliability by providing redundancy for verification and aiding in error correction and calibration.
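One way to see why every sub-frame matters is re-stacking: noise reduction and outlier rejection can only be redone later if the individual exposures survive. The sketch below (Python with NumPy, not the pipeline used in the article) shows a simple sigma-clipped average over preserved sub-frames.

```python
import numpy as np

def stack_subframes(subframes, sigma=3.0):
    """Average a stack of aligned, calibrated sub-frames with simple sigma clipping.

    subframes: iterable of 2-D arrays, one per exposure. Returns the clipped mean frame.
    """
    stack = np.stack(list(subframes), axis=0).astype(np.float64)

    # Reject per-pixel outliers (satellite trails, cosmic rays) before averaging.
    median = np.median(stack, axis=0)
    std = np.std(stack, axis=0)
    mask = np.abs(stack - median) <= sigma * std

    # Mean of the surviving samples; fall back to the median where everything was clipped.
    counts = mask.sum(axis=0)
    summed = np.where(mask, stack, 0.0).sum(axis=0)
    return np.where(counts > 0, summed / np.maximum(counts, 1), median)

# Usage sketch: combine 50 simulated 100x100 sub-frames.
frames = [np.random.normal(100, 5, (100, 100)) for _ in range(50)]
stacked = stack_subframes(frames)
```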
Taking our kit to remote locations and documenting all phases of astrophotography image capture and compilation helps us understand how AI assists us in so many different aspects of life.
In the quest to push the boundaries of astrophotography, particularly at the edge where high-capacity storage and computational efficiency are paramount, a novel approach to improving the sharpness of an image, or deconvolution¹, is revolutionizing our ability to capture the cosmos with unprecedented clarity.
To accomplish this goal, we introduced a groundbreaking convolutional neural network (CNN) architecture that significantly reduces the artifacts traditionally associated with image deconvolution processes.
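For readers who want a feel for what CNN-based deconvolution looks like in code, here is a small, hypothetical sketch in PyTorch. It is an illustrative stand-in, not the architecture described in the Storage Review article; it uses residual learning, a common technique for suppressing the ringing artifacts classic deconvolution tends to introduce.

```python
import torch
import torch.nn as nn

class SimpleDeconvNet(nn.Module):
    """Toy convolutional network for image sharpening (deconvolution).

    Predicts a residual correction that is added back to the blurred input,
    a common trick for reducing deconvolution artifacts. Illustrative only.
    """
    def __init__(self, channels=1, features=32, depth=5):
        super().__init__()
        layers = [nn.Conv2d(channels, features, 3, padding=1), nn.ReLU(inplace=True)]
        for _ in range(depth - 2):
            layers += [nn.Conv2d(features, features, 3, padding=1), nn.ReLU(inplace=True)]
        layers += [nn.Conv2d(features, channels, 3, padding=1)]
        self.body = nn.Sequential(*layers)

    def forward(self, blurred):
        # Residual learning: output = blurred input + predicted high-frequency detail.
        return blurred + self.body(blurred)

# Usage sketch: sharpen a single-channel 512x512 sub-frame.
model = SimpleDeconvNet()
with torch.no_grad():
    sharpened = model(torch.rand(1, 1, 512, 512))
```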
The advancement in deconvolution techniques undertaken in our lab marks a pivotal moment for imaging of all types. By innovatively leveraging deep learning, we stand on the brink of unlocking the additional potential of a digital image, demonstrated here by capturing the universe with clarity and precision previously reserved for only the highest-end configurations.
In our ever-evolving commitment to pushing the boundaries of technology and understanding its limits, we embarked on a unique testing journey with the Dell XR7620 server and Solidigm SSDs.
Our testing for this project was conducted in the harsh embrace of winter, with temperatures plummeting to -15°C and below, amidst a relentless snowstorm. These conditions are far beyond the normal operating environment for most electronic equipment, especially sophisticated server hardware and SSDs designed for data-intensive tasks. The goal was to evaluate not just the performance but the reliability of the Dell XR7620 server and Solidigm SSDs when faced with the extreme cold and moisture that such weather conditions present.
Remarkably, both the server and the SSDs performed without a hitch. There were no adverse effects on their operation, no data corruption, and no hardware malfunctions. This exceptional performance under such testing conditions speaks volumes about the build quality and resilience of these devices.
We've been enamored with high-capacity enterprise SSDs ever since QLC NAND came to market in a meaningful way. Most workloads aren't as write-intensive as the industry believes; this is even more true when it comes to data collection at the edge. Edge data collection and inferencing use cases have an entirely different set of challenges.
The conclusion of our exploration into the unique applications of high-capacity enterprise SSDs, particularly those built on QLC NAND technology, underscores a pivotal shift in how we approach data collection and processing at the edge. The Solidigm D5-P5336 7.68TB SSD and P5336 61.44TB SSD we used in our tests stand out due to their capacity and performance metrics, enabling new research possibilities that were previously constrained by storage capabilities.
In projects like this, where every byte of captured data is a fragment of the cosmos with real value, and where weather and time constraints limit how long you can shoot, the luxury of expansive storage arrays and racks upon racks of gear is not always available.
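For a rough sense of scale, the back-of-the-envelope calculation below shows how far a single 61.44TB drive can stretch in this kind of capture workload. The sub-frame size and exposure cadence are assumptions for illustration; the article does not specify them.

```python
# Back-of-the-envelope capacity math (figures are illustrative assumptions,
# not specifications from the article).
drive_tb = 61.44          # Solidigm D5-P5336 capacity
subframe_mb = 120         # assumed size of one uncompressed raw sub-frame
frames_per_hour = 12      # assumed 5-minute exposures

frames_on_drive = (drive_tb * 1e6) / subframe_mb   # TB -> MB, decimal units
hours_of_capture = frames_on_drive / frames_per_hour

print(f"~{frames_on_drive:,.0f} sub-frames, ~{hours_of_capture:,.0f} hours of imaging")
```

Under those assumptions a single drive holds on the order of half a million raw sub-frames, which is why a compact edge rig can afford to keep everything rather than discard data in the field.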
In essence, the Solidigm D5-P5336 7.68TB and P5336 61.44TB SSDs are not just storage solutions; they are enablers of innovation. They represent a leap forward in addressing the unique challenges of edge computing, facilitating research endeavours that push the boundaries of what is possible.
1 Deconvolution is an image post-processing technique that makes images appear sharper or in greater focus. (https://blog.biodock.ai/image-deconvolution/)
About Solidigm
Solidigm is a leading global provider of innovative NAND flash memory solutions. Solidigm technology unlocks data’s unlimited potential for customers, enabling them to fuel human advancement. Originating from the sale of Intel’s NAND and SSD business, Solidigm became a standalone U.S. subsidiary of semiconductor leader SK hynix in December 2021. Headquartered in Rancho Cordova, California, Solidigm is powered by the inventiveness of team members in 13 locations around the world. For more information, please visit solidigm.com and follow us on Twitter and on LinkedIn. “Solidigm” is a trademark of SK hynix NAND Product Solutions Corp. (d/b/a Solidigm).