According to the commonly cited definition, Big Data refers to data sets so large that they are practically impossible to work with using conventional tools. On its own, however, this definition is inadequate: as hardware and software keep improving, the threshold for "too big" must be continually revised upwards. Big Data typically comes from sources such as:
· Enterprise data: Data shared within an organization (e.g., email)
· Public data: Data open to the public without any protection measures
· Sensor data: Data captured by sensors (e.g., satellites)
· Social media: Information shared online in the form of images, text, audio, and video
· Transactions: Information generated by transactions
Here we are going to discuss everything you need to know about Big Data. Read on!
The vast majority of us use our smartphones every day to surf the internet and spend time on social networks. Every time we do, we leave behind a large amount of data without even realizing it.
Large companies such as Google, Amazon, or Netflix use the data we provide to cater to our personal needs, sell us products, or recommend TV series we might like based on our interests. The storage and dissemination of information have reached an unprecedented level: every minute, a quantity of data is produced that would have been unthinkable only a few years ago.
According to studies, there are currently around 2.7 zettabytes of data circulating in the digital universe (1 zettabyte is equivalent to a trillion gigabytes!), and this volume is expected to grow to 180 ZB by 2025.
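To put those figures in perspective, here is a minimal Python sketch; the unit conversion is standard, while the 2.7 ZB and 180 ZB figures are simply the ones quoted above:

```python
# Rough scale check for the volumes quoted above (illustrative only).
ZB_IN_GB = 10**12          # 1 zettabyte = 10**21 bytes = one trillion gigabytes

current_zb = 2.7           # figure quoted for the current digital universe
projected_zb = 180         # projected figure for 2025

print(f"{current_zb} ZB  = {current_zb * ZB_IN_GB:.1e} GB")
print(f"{projected_zb} ZB = {projected_zb * ZB_IN_GB:.1e} GB")
print(f"Projected growth factor: about {projected_zb / current_zb:.0f}x")
```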
This massive amount of data is meaningless when taken in isolation and generates only confusion. However, if the data are organized and segmented according to specific criteria, they can be extremely useful. This is where Big Data comes in.
Although different definitions circulate on the web, Big Data refers to the study and coordinated management of the very large, complex, and heterogeneous data sets being produced today in every industry.
Features of Big Data
The term Big Data was coined in the early 2000s by Doug Laney, an industry analyst who characterized it along three dimensions, the "3 Vs":
1) Volume: The amount of data generated worldwide doubles every 12-18 months. The main driver of this exponential growth is the steep decline in the cost of computation, analysis, and storage (on average 30-40% per year).
2) Velocity: Data can now be processed, managed, and analyzed much faster, increasingly even in real time, thanks in large part to the continuous increase in Internet speeds.
3) Variety: Data comes in many forms, such as text, numbers, maps, audio, video, and email. Most of it is unstructured, in contrast to traditional databases organized into rigid schemas and tables (see the sketch after this list). Over time, the definition has been extended with two more Vs:
4) Veracity: Refers to the quality and credibility of the data; for example, not all online content is trustworthy.
5) Value: The ultimate purpose is to derive value from Big Data analysis. Before starting any initiative, it is wise to plan and estimate the real value the business can obtain from it.
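As a concrete illustration of the Variety dimension, the small Python sketch below contrasts a structured record, which fits a rigid schema, with unstructured free text, which does not; both records are hypothetical:

```python
# Illustrative contrast between structured and unstructured data (hypothetical records).
structured_order = {            # fits neatly into a table with fixed columns
    "order_id": 10421,
    "customer": "A. Rossi",
    "amount_eur": 59.90,
    "timestamp": "2019-11-29T10:15:00Z",
}

unstructured_review = (         # free text: no fixed schema, needs parsing to analyze
    "Arrived two days late, but the product itself is great. "
    "Customer support was friendly when I called."
)

# A structured field can be read directly...
print("Order amount:", structured_order["amount_eur"])
# ...while unstructured text must first be interpreted, e.g. a naive keyword check:
print("Mentions delay:", "late" in unstructured_review.lower())
```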
What is Big Data for?
The purpose of Big Data is to analyze the flood of available data and extract useful information from it in a reasonable time and with limited resources.
Consider that until a few years ago, extracting information through data analysis required a multi-million-dollar computer and many hours of work; today, a simple laptop and a few hours running sophisticated algorithms are often enough.
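As a minimal sketch of that idea (the file name and columns are hypothetical), the following Python snippet uses pandas to stream a large CSV in chunks and aggregate it on an ordinary laptop without loading everything into memory at once:

```python
import pandas as pd

# Hypothetical file: millions of sales records, too large to load comfortably at once.
CSV_PATH = "sales_records.csv"   # assumed columns: region, amount

totals = {}
# Process the file in 1-million-row chunks so memory use stays bounded.
for chunk in pd.read_csv(CSV_PATH, chunksize=1_000_000):
    partial = chunk.groupby("region")["amount"].sum()
    for region, amount in partial.items():
        totals[region] = totals.get(region, 0.0) + amount

print("Total sales per region:", totals)
```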
Who is it of interest to?
The process of analyzing big data and subsequently extracting the "hidden" information it contains is called big data analytics.
The expert who analyzes this data to create value is the Data Scientist: the person who can tell whether a given piece of data is relevant and can extract knowledge (i.e., consistent information) from it. Big data analysis is carried out in a wide range of sectors; here are the main ones:
Education: The analysis of big data helps to better understand students' level of learning, to identify gaps or possible improvements in study methodologies, and consequently to improve the education system offered by schools.
Medicine: The analysis of big data makes it possible to predict the spread of diseases, identify where they are concentrating or receding, and carry out real-time monitoring and control. It can also help counter potential epidemics.
Security: Analyzing large volumes of data helps prevent terrorist attacks, theft, and vandalism. The same applies to electronic payments, where police forces are already using such analysis to identify the areas most at risk and with the highest crime rates.
Environment: California already has the first smart water meters, which reduce water waste and help in times of drought. Big data is also analyzed to study weather events and to evacuate, in advance, areas that will be affected by natural disasters.
Business: Analyzing consumers' purchasing behavior, from advertisements to newsletters, studying marketing campaigns, monitoring promotions, and gathering feedback: these are all activities that draw on information obtained from big data and that have allowed, and will continue to allow, companies to increase their profit margins.
Transport: Traffic management can be improved in real time by installing sensors in smart street lights that, by exchanging information with a central system, measure and model urban traffic.
Sport: The enormous quantity of data collected makes it possible to study statistics that were unthinkable until a few years ago. These are then used to define game strategies and analyze those of opponents.
The future of big data is an open and rapidly expanding field: the number of professionals who work with data to create value and improve the well-being of society is expected to grow.