Granular computing (GrC) is an emerging model of information processing in which data is divided into information granules. An information granule is a collection of entities grouped together, typically through numerical analysis and data arrangement, on the basis of their functional or physical characteristics, similarities, and differences.
Granular computing is not an algorithm per se but an approach that divides information into smaller pieces and examines how they differ at the granular level. The relationships uncovered are then used to design machine learning (ML) and reasoning systems.
Granular computing can be likened to a human solving for the value of N in a mathematical problem such as "N = 6 x 5 / 3 + 2,556 – 456." Following the MDAS rule, the order in which operations should be performed (i.e., multiplication, division, addition, subtraction), and using parentheses to separate the "granules" makes the task easier. Applying granular computing, you would first multiply 6 by 5 to get 30, then divide 30 by 3 to get 10, then add 2,556 to 10 to get 2,566, and finally subtract 456 from 2,566 to arrive at the final answer, 2,110. By dividing the equation into smaller chunks, it becomes easier to solve.
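The step-by-step evaluation above can be sketched in a few lines of Python, with each intermediate result treated as its own "granule" (the variable names here are illustrative, not part of any standard):

```python
# Evaluating N = 6 * 5 / 3 + 2556 - 456 one "granule" at a time,
# following the MDAS order of operations.
step1 = 6 * 5        # multiplication: 30
step2 = step1 / 3    # division: 10.0
step3 = step2 + 2556 # addition: 2566.0
n = step3 - 456      # subtraction: 2110.0
print(n)             # -> 2110.0
```

Each line solves one small subproblem, mirroring how GrC breaks a larger problem into manageable parts.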
As a theoretical concept, GrC promotes an approach to data analysis that recognizes and exploits the knowledge present in information at various levels of resolution or scale. It considers every available method of solving a problem, depending on how the knowledge or information involved is divided into smaller parts.
How Granular Computing Works
GrC works by dividing a complicated problem into several simpler ones, then looking for similarities and differences. This process allows several applications to work simultaneously on different problem areas, making analysis and resolution faster.
Imagine a satellite image. At a low resolution, you may see only clouds, but by increasing the image's resolution, or granulation, you reveal further details. In short, granular computing lets users process information and identify possible problems in greater detail.
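The satellite-image analogy can be made concrete with a toy example: the same grid of pixel values viewed at two levels of granulation. This is a minimal sketch, not a real image-processing pipeline; the `coarsen` helper and the sample grid are invented for illustration.

```python
# A toy "image" viewed at two resolutions. At the coarse level, each
# granule is the average of a 2x2 block; at the fine level, every
# pixel is its own granule.
image = [
    [1, 1, 8, 8],
    [1, 1, 8, 8],
    [2, 2, 9, 9],
    [2, 2, 9, 9],
]

def coarsen(grid, block=2):
    """Average each block-by-block region into a single coarse granule."""
    size = len(grid) // block
    return [
        [
            sum(grid[r * block + i][c * block + j]
                for i in range(block) for j in range(block)) / block ** 2
            for c in range(size)
        ]
        for r in range(size)
    ]

print(coarsen(image))  # -> [[1.0, 8.0], [2.0, 9.0]]
```

At the coarse level only four broad regions are visible, like the clouds in the analogy; the full-resolution grid exposes the detail within each region.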
Uses of Granular Computing
From a theoretical point of view, granular computing is valuable because it offers a useful model of human problem-solving. It also has a significant influence on the design and development of intelligent systems.
From a practical point of view, GrC shows excellent potential for addressing cybersecurity issues. Research suggests the concept can help with network traffic classification and intrusion detection: segmenting massive amounts of network traffic data and distributing the work among more systems can speed up processing. The same applies to intrusion detection, which likewise relies on traffic analysis and correlation.
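The idea of segmenting traffic data and distributing the work can be sketched as follows. The flow records, the chunk size, and the `classify` rule are all hypothetical stand-ins; a real system would use genuine traffic features and a trained classifier.

```python
# A minimal sketch of granulating a traffic log into chunks and
# classifying the chunks in parallel. The records and the toy
# classify() rule are illustrative assumptions, not a real detector.
from concurrent.futures import ThreadPoolExecutor

records = [{"bytes": b} for b in (120, 99000, 340, 250000, 75)]

def classify(chunk):
    # Toy rule: flag unusually large flows as suspicious.
    return ["suspicious" if r["bytes"] > 50000 else "normal" for r in chunk]

def chunks(items, n):
    # Split the full log into granules of n records each.
    return [items[i:i + n] for i in range(0, len(items), n)]

with ThreadPoolExecutor() as pool:
    results = [label
               for part in pool.map(classify, chunks(records, 2))
               for label in part]

print(results)  # -> ['normal', 'suspicious', 'normal', 'suspicious', 'normal']
```

Because each chunk is independent, the granules can be handed to separate workers (or separate machines) and their results merged afterward, which is what makes the approach attractive for high-volume traffic analysis.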
Today, granular computing remains largely a concept, but it is showing real promise for developing intelligent information systems and providing practical problem-solving methods.