Experts in cloud computing have been working with NASA to adapt open source programs to handle the masses of data the space agency receives every day. NASA's missions collect hundreds of terabytes of data every hour, which presents a Big Data challenge for the agency. But a team at NASA's Jet Propulsion Laboratory (JPL) has come up with a range of new strategies for collating, storing, processing and accessing that data, enabling the agency to gather the best intelligence from it.
Eric De Jong, a member of the JPL team, said: “Scientists use big data for everything from predicting the weather on Earth to monitoring ice caps on Mars, to searching for distant galaxies. We are keepers of the data, and the users are the astronomers and scientists who need images, mosaics, maps and movies to find patterns and verify theories.”
Simply archiving the gathered data is a sizeable task in itself, but specialists at the agency have managed to do just that using their existing hardware, adapting current open source programs and applying cloud computing techniques, so that the masses of archived data can then be accessed and used. The team is also developing new methods to make the data more versatile and accessible for public use.
Chris Mattmann, a principal investigator at JPL, said: “We don’t need to reinvent the wheel. We can modify open source computer codes to create faster, cheaper solutions.”
The age of technology!
By Allie Philpin