On Friday, May 19th, at Learning Friday, Dario Santilli talked about “Big Data for Dummies”, giving us a theoretical basis for approaching the topic.
What does Big Data mean? Do we really know what this term means? We started from there, with an explanation of a term that has now entered common language.
We use the term big data to describe a collection of data so extensive in terms of volume, velocity and variety that it requires specific technologies and analytical methods to extract value.
A real example of what BIG means: every day millions of searches are made on Google, about 4 million searches per minute!
Given the tremendous amount of data to manage, you need specific methods for collecting it, using schemaless databases such as graph or column-family stores, where a single table can grow to tens of thousands of columns.
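The talk stayed at the conceptual level, but the column-family idea mentioned above can be sketched in a few lines. The `ColumnFamily` class below is a toy in-memory illustration (an assumption of ours, not any real database API): each row key maps to its own set of columns, so rows need not share a schema and new columns cost nothing to add.

```python
from collections import defaultdict

class ColumnFamily:
    """Toy sketch of a schemaless, column-family-style store."""

    def __init__(self):
        # Each row key maps to its own independent column -> value dict.
        self._rows = defaultdict(dict)

    def put(self, row_key, column, value):
        # Adding a column to one row does not affect any other row.
        self._rows[row_key][column] = value

    def get(self, row_key, column=None):
        # Return the whole row, or a single column value if requested.
        row = self._rows.get(row_key, {})
        return dict(row) if column is None else row.get(column)

users = ColumnFamily()
users.put("user:1", "name", "Ada")
users.put("user:1", "email", "ada@example.com")
# A different row can carry entirely different columns:
users.put("user:2", "name", "Alan")
users.put("user:2", "last_search", "big data")

print(users.get("user:1"))                  # {'name': 'Ada', 'email': 'ada@example.com'}
print(users.get("user:2", "last_search"))   # big data
```

Real column-family databases (e.g. Cassandra or HBase) add persistence, partitioning and replication on top of this basic shape, which is what makes them suitable for big-data volumes.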
Over time, other management systems have also been developed to connect information more effectively and respond to the increasing complexity of the data: the larger the dataset, the greater the complexity of the data to handle.
Dario went on to explain the six phases of the Big Data process, illustrating the ultimate goal of this laborious process and ending with a “Funny Use Case”.
Do you want to get into the subject? Click on “View Attachment”!