Big data is everywhere today, from the shopping websites we use to buy household necessities to the programs an enterprise uses to track its sales. But the reach of big data extends far wider than that. It wouldn’t be wrong to say it’s out of this world, quite literally: NASA is using big data in space probes and beyond to help solve otherworldly problems. Here are some examples of how NASA is implementing big data technology:
Real-Time Analytics through Elasticsearch
The team in control of NASA’s Mars rover now has big-data-driven analytical engines at its fingertips. Elasticsearch, the open-source search and analytics technology used by companies like Netflix and Goldman Sachs, helps plan the actions of the rover, which landed on Mars in 2012. NASA’s Jet Propulsion Laboratory, which runs the rover’s day-to-day mission planning, has rebuilt its analytics systems around Elasticsearch, which processes all of the data the rover transmits during its four scheduled downlinks each day.
Elasticsearch, which has passed its 50 millionth download, means that anomalies and patterns in the datasets can be identified almost instantly. The rate of malfunction and failure can be greatly reduced, because correlations surface mission-critical insights that can lead to a greater rate of scientific discovery. Anomaly resolution is one such application: when a problem with the spacecraft is identified, accurate details of its operations can be analysed immediately to find out the last time the situation occurred and what other elements were involved at that time.
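As a rough illustration of how such a lookup might work, here is a minimal sketch that builds an Elasticsearch query for earlier occurrences of a flagged anomaly. The index and field names (`rover-telemetry`, `event_type`, `subsystem`, `timestamp`) are assumptions for illustration only, not NASA’s actual schema.

```python
# Hypothetical sketch: when an anomaly is flagged, search the telemetry
# index for earlier occurrences of the same condition. All index and
# field names here are assumptions, not NASA's real schema.

def build_anomaly_query(event_type, subsystem, window="30d"):
    """Build an Elasticsearch query body for prior matching events."""
    return {
        "query": {
            "bool": {
                "filter": [
                    {"term": {"event_type": event_type}},
                    {"term": {"subsystem": subsystem}},
                    {"range": {"timestamp": {"gte": f"now-{window}"}}},
                ]
            }
        },
        # Most recent occurrences first, capped at ten hits.
        "sort": [{"timestamp": {"order": "desc"}}],
        "size": 10,
    }

query = build_anomaly_query("thermal_excursion", "battery")
# A real deployment would send this with the elasticsearch-py client:
#   Elasticsearch("http://localhost:9200").search(index="rover-telemetry", body=query)
```

Building the query body as plain data keeps the sketch runnable without a live cluster; only the last commented line would change in a real deployment.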
The Soil Moisture Active Passive (SMAP) project, launched in 2015, also uses Elasticsearch. Arguably the most interesting aspect of Mars exploration is the possibility of determining whether or not the planet has ever harboured life, and fast analytics are expected to help answer that question sooner. NASA’s goal is to send life-detecting instruments to Mars by 2020. The analysis needs to be fast, and technologies like Elasticsearch are indeed a boon in such scenarios.
The SKA Project
The Square Kilometre Array (SKA) is one of the biggest scientific projects currently planned. Construction is scheduled to begin in 2018 and finish in the mid-2020s. The SKA will be the world’s largest radio telescope: essentially a grouping of thousands of antennas and dishes that work together to peer deep into the farthest reaches of space.
It will be capable of detecting even the weakest of signals from sources up to 50 light years away. The idea behind the project is to probe the unknowns of our universe, from black holes to dark matter to undiscovered planets. The potential of this project is just as huge as the construction itself, if not bigger.
So how is the SKA related to big data? In reality, the SKA project is practically the very definition of big data. It is predicted that the SKA will generate up to 700 terabytes of data per second. To put that in perspective, that is roughly the amount of data transmitted through the entire internet every two days. Needless to say, that is a huge amount of data, and it poses some unique challenges for astronomers and data scientists, most of them related to storing and processing such large volumes.
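A quick back-of-the-envelope check puts the quoted figure in perspective. This is plain arithmetic on the 700 TB/s number cited above (using decimal units), not an official SKA estimate of total volume.

```python
# Back-of-the-envelope arithmetic on the quoted SKA data rate.
TB = 10**12                          # bytes per terabyte (decimal)

rate_bytes_per_s = 700 * TB          # predicted peak data rate
two_days_s = 2 * 24 * 3600           # seconds in two days

total_bytes = rate_bytes_per_s * two_days_s
exabytes = total_bytes / 10**18
print(f"{exabytes:.0f} exabytes every two days")
# prints: 121 exabytes every two days
```

Roughly 121 exabytes in two days, which is indeed on the scale of global internet traffic figures commonly cited for that period.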
NASA identifies this as a hindrance not only for the SKA but for space exploration as a whole, and is therefore developing tools to handle the storage and transmission of this data. Data transmission through space has proven to be a big challenge: sending data from deep-space probes and other spacecraft over radio frequencies is a very slow process. Scientists are therefore developing better methods, such as optical (laser) communications, which increase both download speeds and the amount of data that can be transferred.
How NASA Used Big Data to Learn More about Pluto
In 2015, the New Horizons spacecraft had only a few hours to fill its memory banks with as much data as possible from the dwarf planetary system. Finally, on October 25th, the last few hundred bits of that data arrived at one of NASA’s Deep Space Network radio dishes.
Before the flyby, the dwarf planet was a pixelated blur. Scientists had only faint ideas about Pluto’s atmosphere, geology, and satellite system, and ‘pixelated’ was the perfect analogy for the clarity of those concepts too. New Horizons did not just deliver a picture of Pluto’s massive heart; the flyby exposed the dwarf planet as one of the most dynamic and complex worlds in the solar system. In total, New Horizons collected 6.25 gigabytes of data with its seven instruments, including multi-spectral imagers, particle detectors, and dust counters. With the spacecraft some 3 billion miles away, the ground team retrieved the flyby data bit by bit over 469 days.
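The figures above imply a strikingly slow link. As a rough illustration (simple arithmetic on the numbers quoted, assuming decimal gigabytes and ignoring the faster bursts during actual passes), the average downlink rate works out to roughly a kilobit per second:

```python
# Effective average downlink rate implied by the figures above:
# 6.25 GB returned over 469 days from ~3 billion miles away.
GB = 10**9
data_bits = 6.25 * GB * 8            # total data volume in bits
seconds = 469 * 24 * 3600            # downlink period in seconds

rate_bps = data_bits / seconds       # average bits per second
print(f"average rate ≈ {rate_bps:.0f} bits per second")
# prints: average rate ≈ 1234 bits per second
```

That is slower than a 1990s dial-up modem, which is why the downlink took well over a year.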
Every bit of data on New Horizons’ drives is indexed according to when it was collected. Engineers had to calculate exactly how many seconds it would take the probe to reach Pluto and then instruct the computer with commands like, “At 299,791,044 seconds after New Horizons has left Earth, roll 32 degrees, pitch 15 degrees, yaw 256 degrees, and activate the Ralph imager for 0.25 seconds.”
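The scheme described above can be sketched as a command sequence indexed by mission elapsed time. The `Command` class and field names here are purely illustrative; the real flight software is far more involved.

```python
# Hypothetical sketch of a time-indexed command sequence, as described
# above. Structure and names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Command:
    met_seconds: float   # mission elapsed time: seconds since launch
    action: str

sequence = [
    Command(299_791_044.00, "roll 32 deg, pitch 15 deg, yaw 256 deg"),
    Command(299_791_044.50, "activate Ralph imager for 0.25 s"),
]

def due_commands(sequence, met_now):
    """Return all commands scheduled at or before the current mission time."""
    return [c for c in sequence if c.met_seconds <= met_now]

print([c.action for c in due_commands(sequence, 299_791_044.25)])
# prints: ['roll 32 deg, pitch 15 deg, yaw 256 deg']
```

Indexing everything by mission elapsed time, rather than Earth clock time, sidesteps the hours-long light delay between the probe and ground control.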
New Horizons sent data back every week for a year and a half. The images showed plains of frozen nitrogen, atmospheric haze layers, cryogenic volcanoes, vast glaciers, mountains made of water ice, and geologic evidence that Pluto has been tectonically active for 4.5 billion years. They also revealed surprises from each of Pluto’s five satellites, such as Charon’s equatorial canyon, at least 600 miles long.
These instances are only a few examples of the impact big data is having on space exploration. Though they may seem rather abstract, this progress could have a major impact on our routine activities; sometimes the farthest-reaching discoveries have an enduring impact close to home. Whatever the case, big data is enabling scientists to get a clearer picture of the universe, and this is just the beginning. Can you imagine what the future has in store?