Professor Dr. Zhai Junhai
Hebei University, China
Big Data, Machine Learning
Classification of Imbalanced Data: Traditional Methods, Modern Methods and Future Trends
In real-world applications, training data often exhibit an imbalanced class distribution, in which a few classes contain a large number of instances while the others have only a few; examples include software defect prediction data, medical diagnosis data, credit card fraud detection data, spam filtering data, and severe convective weather prediction data. This phenomenon poses a challenge for machine learning and has attracted extensive attention from researchers over the last decade. In this keynote, we first survey traditional and modern methods for the classification of imbalanced data. The traditional methods fall into three categories: data-level methods, algorithm-level methods, and hybrid methods; among modern methods, we focus on deep learning-based ones. Second, we present some of our proposed methods, including a method based on diversity oversampling and classifier fusion, and an approach for imbalanced big data classification. Third, we analyze future development trends in imbalanced data classification and present some potential research directions.
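The data-level methods mentioned above rebalance the training set before any classifier is trained. As a minimal illustration, the sketch below implements plain random oversampling, the simplest data-level baseline; it is not the diversity-oversampling method of the talk, and the toy data are invented for the example.

```python
import random
from collections import Counter

def random_oversample(X, y, seed=0):
    """Duplicate minority-class samples until every class matches the majority size."""
    rng = random.Random(seed)
    counts = Counter(y)
    target = max(counts.values())
    X_out, y_out = list(X), list(y)
    for cls, n in counts.items():
        idx = [i for i, lab in enumerate(y) if lab == cls]
        for _ in range(target - n):
            i = rng.choice(idx)          # resample an existing minority instance
            X_out.append(X[i])
            y_out.append(cls)
    return X_out, y_out

# Imbalanced toy set: six majority-class vs two minority-class instances
X = [[0.1], [0.2], [0.3], [0.4], [0.5], [0.6], [5.0], [5.1]]
y = [0, 0, 0, 0, 0, 0, 1, 1]
X_bal, y_bal = random_oversample(X, y)
print(Counter(y_bal))  # both classes now have 6 instances
```

Duplicating instances adds no new information; this is exactly the limitation that diversity-oriented oversampling schemes aim to overcome.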
Professor Guisheng Zhai
Shibaura Institute of Technology, Japan
Applied Mathematics, Control Theory
Mathematics and Optimization in Consensus Control of Multi-Agent Systems
We first survey the problem formulation and recent developments in consensus control of multi-agent systems, and establish a necessary and sufficient condition for designing consensus and formation in second-order multi-agent systems networked by digraphs (characterized by graph Laplacians). Assuming that the control input of each agent is constructed from the weighted difference between its states and those of its neighbor agents, we transform the control problem into one of designing Hurwitz polynomials with complex coefficients, which is a typical algebra problem and thus easy to solve. Second, as an application, we deal with the consensus control problem for a set of three-link manipulators, where the control inputs of each manipulator are the torques on its links, generated by adjusting the weighted difference between the manipulator's states and those of its neighbor manipulators. We then take advantage of the necessary and sufficient condition obtained above to adjust the parameters in the control inputs so that full consensus is achieved among the manipulators. Moreover, we extend the discussion to designing the convergence rate of consensus, and provide numerical examples to demonstrate the validity and applicability of the whole approach.
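The weighted-difference control law described above can be seen in its simplest form in a first-order numerical sketch (the talk treats the harder second-order case): each agent's state obeys x' = -Lx for a digraph Laplacian L, integrated here by forward Euler. The three-agent cycle digraph and its weights are an assumed example, not taken from the talk.

```python
# First-order consensus x_i' = -sum_j a_ij (x_i - x_j), i.e. x' = -L x,
# integrated by forward Euler on an assumed 3-agent digraph.
def consensus(x0, L, dt=0.01, steps=2000):
    x = list(x0)
    n = len(x)
    for _ in range(steps):
        dx = [-sum(L[i][j] * x[j] for j in range(n)) for i in range(n)]
        x = [x[i] + dt * dx[i] for i in range(n)]
    return x

# Laplacian of a strongly connected 3-agent cycle digraph (row sums are zero)
L = [[ 1.0, -1.0,  0.0],
     [ 0.0,  1.0, -1.0],
     [-1.0,  0.0,  1.0]]
x = consensus([1.0, 5.0, 9.0], L)
print(x)  # all three states converge toward a common value
```

Because the digraph is strongly connected, the Laplacian has a simple zero eigenvalue and the states agree asymptotically; for this symmetric cycle the consensus value is the average of the initial states.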
Assoc. Prof. Dr. Norhayati Rosli
Universiti Malaysia Pahang
Stochastic Modelling, Risk Based Inspection, Numerical Analysis
Derivation of the Stochastic Runge-Kutta Method for Stochastic Delay Differential Equations
Random effects and time delays are inherent properties of many real phenomena around us, hence such systems should be modelled via stochastic delay differential equations (SDDEs). The growth of a microbe, for example, is not instantaneous but responds only after some time lag. Many diseases such as influenza, tuberculosis and coronavirus disease (COVID-19) have a latent or incubation period, during which an individual is said to be infected but not infectious. The aforementioned processes are also subject to uncontrolled random factors. However, complexity arises from the presence of both randomness and time delay, and the analytical solution of an SDDE is hard to find. In such cases, a numerical method provides a way to solve the problem. In this research we propose a newly developed stochastic Runge-Kutta method of order 1.5 for solving SDDEs. The derivation of the method is presented, and the stability of the scheme is investigated in terms of mean-square (MS) stability.
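While the talk derives an order-1.5 stochastic Runge-Kutta scheme, the lower-order Euler-Maruyama method below already shows how the delay term enters a numerical SDDE solver: the drift is evaluated at the lagged state, which is read from the stored path. The linear test equation and its parameters are assumptions chosen for illustration.

```python
import random, math

def euler_maruyama_sdde(a, b, tau, phi, T, dt, seed=0):
    """Euler-Maruyama for dX(t) = a*X(t - tau) dt + b*X(t) dW(t),
    with history function X(t) = phi(t) on [-tau, 0]."""
    rng = random.Random(seed)
    lag = int(round(tau / dt))          # delay measured in grid steps
    n = int(round(T / dt))
    # Prefill the path with the history function on [-tau, 0]
    path = [phi(-tau + k * dt) for k in range(lag + 1)]
    for _ in range(n):
        x = path[-1]
        x_lag = path[-1 - lag]          # X(t - tau), read from the stored path
        dW = rng.gauss(0.0, math.sqrt(dt))
        path.append(x + a * x_lag * dt + b * x * dW)
    return path[lag:]                   # solution on [0, T]

# Deterministic sanity check (b = 0): dX = -X(t-1) dt with constant history 1,
# whose exact solution on [0, 1] is X(t) = 1 - t
sol = euler_maruyama_sdde(a=-1.0, b=0.0, tau=1.0, phi=lambda t: 1.0,
                          T=1.0, dt=0.001)
print(sol[0], sol[-1])
```

An order-1.5 scheme refines this by adding higher-order stochastic integral terms, but the bookkeeping for the delayed state is the same.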
Assoc. Prof. Dr. Norma Alias
Universiti Teknologi Malaysia
Industrial Computing, Big Data Simulation for Complex System on High Performance Computing (HPC)
Big Data Modelling, Data Analytics and Large-Scale Simulation on High Performance Computing Platforms in Support of the Nanotechnology Industry
The principles of nanotechnology, expressed through a variety of definitions, theorems and rules, highlight the main current challenges in complex modelling, data analytics and large-scale simulation. Dealing with the Navier–Stokes equations, Watson–Crick base-pairing rules, the Schrödinger equation, the Kuramoto–Sivashinsky equation, molecular dynamics rules, Newtonian dynamics, Stokes' law, Von Neumann algorithms, Monte Carlo models, finite difference, finite element and finite volume discretizations, and quantum and statistical mechanics makes nanotechnology extremely complex. The discretization of models, functions and equations generates fine-grained grid structures that can easily occupy terabytes of data. Only researchers with a strong fundamental mathematical model and validated numerical software are able to describe the processes of nanotechnology accurately. Given this limitation, a new direction of mathematical modelling is needed to build confidence and meet these challenges by exploiting the large quantities of data generated globally. Big data analytics, machine learning and deep learning help researchers validate and verify complex models. Thus, the demands of high performance computing (HPC), high-speed processors, large memory allocation and cloud computing will be addressed to support large-scale computation, high-speed simulation and high-speed memory access in the industries of nanofabrication, quantum dots, carbon nanotubes, and nano-design and prediction for decision making. Many distributed, shared and hybrid memory architectures, cloud computing platforms and open source software packages support the HPC architecture. Some examples of nanotechnology applications with terabytes of data and huge numbers of parameters will be solved using classifier methods on the HPC platform. Numerical analysis, parallel performance evaluation and big data analytics indicators will be used to analyse the performance of complex models and of machine learning and deep learning algorithms empirically.
Under the current scenario, I believe the transition from complex modelling to big data analytics will meet the nanotechnology challenges of the next industrial revolution.
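The grid structures mentioned above arise whenever a continuous model is discretized; as a minimal illustration, the sketch below applies an explicit finite-difference scheme to the 1-D heat equation. The equation, grid size and parameters are assumed toy choices; refining such a grid in three dimensions is what drives the terabyte-scale data volumes the abstract describes.

```python
# Explicit finite-difference discretization of the 1-D heat equation
# u_t = alpha * u_xx, with forward Euler in time and central differences in space.
def heat_step(u, alpha, dx, dt):
    """One explicit step with fixed (Dirichlet) boundary values."""
    r = alpha * dt / dx**2              # stability requires r <= 0.5
    return [u[0]] + [u[i] + r * (u[i+1] - 2*u[i] + u[i-1])
                     for i in range(1, len(u) - 1)] + [u[-1]]

# Grid of 11 points on [0, 1]: a hot spike in the middle, cold boundaries
u = [0.0] * 11
u[5] = 1.0
for _ in range(200):
    u = heat_step(u, alpha=1.0, dx=0.1, dt=0.004)
print(u[5])  # the spike diffuses away toward the zero boundary values
```

Here r = 0.4 keeps the explicit scheme stable; halving dx in 3-D multiplies the grid-point count by eight and forces dt down by four, which is precisely why fine-granularity simulation demands HPC resources.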
Dr. Mazlan Abbas
Co-founder and CEO of FAVORIOT Sdn. Bhd.
Internet of Things
Harnessing the Power of IoT & IR 4.0
Of the companies that existed 50 years ago, only 19% are still in existence today. Many did not survive or were unable to sustain themselves for many reasons, but a key one is that they could not remain competitive while their competitors transformed their businesses by leveraging new technologies and new business models.
Digitalisation and the Internet are at the core of these transformations. New products and services have been created, and operations have become more agile and productive. This is the outcome when companies embrace change openly, led by risk takers with an excellent vision of the future.
Every 3 to 7 years, companies need to transform themselves and reinvent their operations and business. They should look at current and future technology trends and consider whether they can leverage these powerful technologies to ride the great waves of disruption. The time has come for us to harness the power of IoT and IR 4.0.
Mr. Nor Azmi Abdul Rahim
Senior Consultant, Creative Vision Sdn. Bhd.
Malaysia Industrial Mathematics and Statistics – Disciplines for Industry Growth Achievement Assurance
Accurate Industry Direction and Industry Advancement Achievement Assurance are two critical disciplines that contribute to the success of a national plan for industrial economic growth and advancement. The former requires subscription to World Class Standards and Industry Best Practices, with disciplined implementation and assurance that the projected Economic Return is achieved. The latter requires well-rationalized and valid projections of marketplace requirements, technology advancement requirements, enabling capital (6-Capital), competency, and commitment management. The latter cannot stand on its own without the former, and vice versa.
World Class Standards and Industry Best Practices demand effective operations management systems with disciplined application of the relevant Mathematical and Statistical Tools to guarantee the validity, accuracy and comprehensiveness of an Industry Strategic Business Plan. Market Data Analysis for product strategy, BCG Quadrant Ratio Analysis for sustainability, Value Curve Analysis for competitiveness, Pricing and Profitability Ratios, and Economic Return Projection are examples of relevant tools.
Industry Advancement Achievement Assurance requires a disciplined management system of Kaizen and Innovation, in which opportunities, issues and problems are addressed with commitment and with zero tolerance for failure. To this end, Mathematical and Statistical Tools are necessary either to capture opportunities or to prevent and correct issues and problems. Six Sigma Tools, Statistical Process Control Tools, and Design Engineering – Applied Computational Mathematics are examples of relevant tools.
This paper presents the speaker's broad perspective, based on 30 years of experience assessing and observing the subscription to and management of World Class Standards and Industry Best Practices and of Industry Advancement Achievement Assurance, together with the application of multidisciplinary Mathematical and Statistical Tools, in Malaysian industry. The speaker's experience draws on success stories of managing organizations, turn-around projects, and industry Strategic Planning development and realization. A comparative analysis and recommended actions are tabulated across the practices of 30 MNCs, 30 local conglomerates with international reach, 30 domestic-market businesses, and 20 SMEs. Top-ranking, world-recognized MNCs operating in Malaysia are used as the yardstick of comparison, on the assumption that Malaysia's industry aspires to reach such a stature.