For the first time, storage space is almost unlimited. Not only is it possible to have petabytes or even zettabytes of cloud storage, but we can have it at an affordable price. Cloud storage providers such as Amazon operate dedicated fleets of data-transfer trucks, each capable of carrying on the order of a hundred petabytes, or quite literally "tonnes" of data, to the cloud. Different kinds of information are stored and are accessible anytime from anywhere. IoT sensor readings are transferred in real time from all over the world, business transactions are stored and remain available for years, and mobile devices constantly upload data of every kind imaginable: sensor data, images, videos, positioning information. Health devices provide readings from monitoring hardware (e.g. pulse rate, BPM, SpO2), information about the patient's medication progress or symptoms, and also information about the patient's habits (e.g. daily steps or sleep quality). All of this can be securely stored, providing a valuable tool for researchers.

In addition, the cost of processing power has dropped dramatically. Clusters with hundreds of CPUs can be bought for a few thousand euros on eBay. Moreover, the processing power sits in the same place as the data: in the cloud. Virtual machines and Docker containers can accommodate whatever platform is needed to serve our needs, optimising data usage. Elastic allocation provides exactly the resources we need, for exactly as long as we need them. All of the above are tied together in platforms such as Apache Hadoop, making it possible to deploy and provision a new cloud platform in only a few hours.
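The elastic-allocation idea above can be sketched in a few lines. This is a toy model, not any provider's real API: the `Cluster` class and its scaling thresholds are invented for illustration.

```python
# Toy sketch of an elastic-allocation policy: grow the worker pool when
# the job backlog builds up, shrink it when the queue drains. The class,
# thresholds and limits below are assumptions for this example only.

class Cluster:
    """Minimal model of a worker pool that scales on demand."""

    def __init__(self, min_workers=2, max_workers=16):
        self.min_workers = min_workers
        self.max_workers = max_workers
        self.workers = min_workers

    def scale(self, pending_jobs):
        """Add workers under load; release them when demand falls."""
        if pending_jobs > self.workers * 4:       # backlog: scale out
            self.workers = min(self.workers * 2, self.max_workers)
        elif pending_jobs < self.workers:         # idle capacity: scale in
            self.workers = max(self.workers // 2, self.min_workers)
        return self.workers


cluster = Cluster()
print(cluster.scale(100))  # heavy backlog: pool doubles from 2 to 4
print(cluster.scale(1))    # queue drained: pool shrinks back to the minimum
```

Real cloud schedulers apply the same feedback loop, only with richer signals (CPU load, memory pressure, cost budgets) in place of a simple queue length.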

What can we expect from big data in the new year?

Big data has grown rapidly over the last few years and is now in a position to form the brain of the new internet era. Software development is being transformed into software science. We no longer care about data as data; instead, we search for the actual information contained in the abundance of data, depending on the task at hand. Advanced signal-processing algorithms extract information on demand, correlate different types of information and, going further, make decisions. Machine learning, neural networks and deep learning are turning software scientists into some of the most highly paid professionals, the builders of the internet brain. People living in big cities are already familiar with applications such as the traffic information provided by Google Maps. This is an excellent example of big data: we do not care about who is moving, when or why; we just want to know the state of the traffic at a specific place. The raw information is available from the thousands of devices moving through the area, and big data systems process it to give us the traffic status in real time. Moreover, we can predict the traffic at a specific time based on historical data from similar days, periods or even weather conditions. We may soon see the internet taking an active part in crime prevention, identifying risks and managing portfolios without any human interaction. Big data will become a part of our daily life, the Pythia of the 21st century.
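The prediction-from-similar-days idea can be made concrete with a minimal sketch. The sample data and the similarity criterion (same weekday and hour) are invented here for illustration; a production system would use far richer features, such as weather or holidays.

```python
# Hedged sketch: estimate traffic speed for a given slot as the mean of
# historical observations from "similar" days. The data below is made up.

from statistics import mean

# (weekday, hour) -> observed average speeds in km/h on past similar days
history = {
    ("mon", 8): [22, 25, 19, 21],   # morning rush hour: slow
    ("mon", 14): [48, 52, 50],      # early afternoon: free-flowing
}

def predict_speed(weekday, hour):
    """Predict speed as the mean of past observations for this slot."""
    observations = history.get((weekday, hour))
    if not observations:
        return None  # no similar days on record
    return mean(observations)

print(predict_speed("mon", 8))   # -> 21.75 km/h
```

Even this naive averaging captures the core of the Google Maps example: anonymous, aggregated observations become a usable forecast without ever asking who was moving or why.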

By admin