
Big Challenges with Big Data

Last Updated : 14 Jan, 2019

The challenges in Big Data are the real implementation hurdles. They require immediate attention, because if they are left unaddressed the technology deployment may fail and produce unpleasant results. Big Data challenges include storing and analyzing extremely large, fast-growing volumes of data.

Some of the Big Data challenges are:

  1. Sharing and Accessing Data:
    • Perhaps the most frequent challenge in big data efforts is the inaccessibility of data sets from external sources.
    • Sharing data can cause substantial challenges, including the need for inter- and intra-institutional legal agreements.
    • Accessing data from public repositories brings its own difficulties.
    • Data needs to be available in an accurate, complete and timely manner, because a company's information systems can only support accurate, timely decisions if the data arrives in that form.

  2. Privacy and Security:
    • Privacy and security is another major challenge with Big Data, with sensitive conceptual, technical and legal dimensions.
    • Most organizations are unable to run regular checks on the large volumes of data they generate. Ideally, however, security checks and monitoring should happen in real time, where they are most beneficial.
    • Seemingly harmless information about a person, when combined with large external data sets, can reveal facts that the person considers private and would not want the data owner to know.
    • Some organizations collect information about people in order to add value to their business, deriving insights into people's lives that those people are unaware of.
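A common first line of defence against the re-identification risk described above is to pseudonymize direct identifiers before data is shared. The sketch below is a minimal illustration using salted hashing; the record fields and salt are hypothetical, and salted hashing alone does not guarantee anonymity against linkage with external data sets.

```python
import hashlib

# Hypothetical customer record; the field names are illustrative only.
record = {"name": "Alice Smith", "email": "alice@example.com", "purchase": "laptop"}

SALT = "org-secret-salt"  # kept private by the data owner

def pseudonymize(value: str) -> str:
    """Replace a direct identifier with a salted SHA-256 digest."""
    return hashlib.sha256((SALT + value).encode()).hexdigest()[:12]

# Mask the direct identifiers before the record leaves the organization.
shared = {
    "user_id": pseudonymize(record["email"]),
    "purchase": record["purchase"],
}
print(shared)
```

The same input always maps to the same pseudonym, so analysts can still join records per user without ever seeing the underlying email address.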

  3. Analytical Challenges:
    • Big data raises huge analytical challenges, starting with questions such as: how do you deal with a problem when the data volume gets too large?
    • How do you identify the important data points?
    • How do you use the data to best advantage?
    • The data on which such analysis is performed can be structured (organized), semi-structured (partially organized) or unstructured (unorganized). There are two broad approaches to decision making:
      • Either incorporate the massive data volumes directly in the analysis,
      • Or determine upfront which Big Data is relevant.
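The second approach, determining upfront which data is relevant, can be sketched as a streaming filter: records are discarded before analysis, so the full volume never has to fit in memory. The data source and the relevance rule below are invented for illustration.

```python
def stream_records():
    """Stand-in for a huge data source (a file, message queue, etc.)."""
    for i in range(1_000_000):
        yield {"sensor": i % 10, "value": i}

def relevant(rec):
    """Hypothetical upfront relevance rule: we only analyse sensor 3."""
    return rec["sensor"] == 3

total = count = 0
for rec in stream_records():
    if relevant(rec):        # drop irrelevant data before it reaches the analysis
        total += rec["value"]
        count += 1

print(count, total / count)  # only one tenth of the stream was analysed
```

Because filtering happens while streaming, peak memory use stays constant regardless of how large the source grows.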

  4. Technical challenges:
    • Quality of data:
      • Collecting and storing large amounts of data comes at a cost, yet business and IT leaders often push for ever larger data stores.
      • For better results and conclusions, Big Data work should focus on storing quality data rather than irrelevant data.
      • This raises further questions: how can it be ensured that the data is relevant, how much data is enough for decision making, and is the stored data accurate?
    • Fault tolerance:
      • Fault tolerance is another technical challenge; fault-tolerant computing is extremely hard and involves intricate algorithms.
      • Newer technologies such as cloud computing and Big Data platforms are designed so that when a failure occurs, the damage stays within an acceptable threshold rather than the whole task having to begin again from scratch.
    • Scalability:
      • Big data projects can grow and evolve rapidly; this scalability pressure has driven Big Data towards cloud computing.
      • It raises challenges such as how to schedule and execute many jobs so that the goal of each workload is achieved cost-effectively.
      • It also requires handling system failures efficiently, which again raises the big question of what kinds of storage devices should be used.
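The fault-tolerance idea above, that a failure should not force the whole task to begin from scratch, is commonly implemented with checkpointing. The sketch below is a minimal, single-machine illustration; the file name, helper functions, and simulated failure are all hypothetical, and real platforms checkpoint far more elaborately.

```python
import json
import os
import tempfile

# Hypothetical checkpoint location; real systems use replicated storage.
CHECKPOINT = os.path.join(tempfile.gettempdir(), "bigdata_checkpoint.json")
if os.path.exists(CHECKPOINT):
    os.remove(CHECKPOINT)  # start this demo from a clean state

def load_checkpoint() -> int:
    """Return the index of the next item to process (0 if no checkpoint)."""
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            return json.load(f)["done"]
    return 0

def save_checkpoint(done: int) -> None:
    with open(CHECKPOINT, "w") as f:
        json.dump({"done": done}, f)

def process(item: int) -> None:
    pass  # the real work (transform, load, etc.) would go here

def run(items, fail_at=None):
    """Process items, checkpointing after each; optionally crash at fail_at."""
    start = load_checkpoint()
    for i in range(start, len(items)):
        if i == fail_at:
            raise RuntimeError("simulated node failure")
        process(items[i])
        save_checkpoint(i + 1)

items = list(range(100))
try:
    run(items, fail_at=40)      # first attempt fails partway through
except RuntimeError:
    pass
run(items)                      # the restart resumes at item 40, not item 0
```

After the simulated crash, the checkpoint records 40 completed items, so the rerun skips straight to the remaining work, keeping the damage within an acceptable threshold as the section describes.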
