A Neutrosophic Forecasting Model for Time Series Based on First-Order State and Information Entropy of High-Order Fluctuation

Author: Hongjun Guan

Publisher: Infinite Study

Published:

Total Pages: 18

ISBN-13:

In time series forecasting, the way information is represented directly affects prediction efficiency. Most existing time series forecasting models derive logical rules from the relationships between neighboring states, without considering the inconsistency of fluctuations over the related period. In this paper, we propose a new perspective on the prediction problem, in which inconsistency is quantified and treated as a key characteristic of the prediction rules. First, the time series is converted into a fluctuation time series by comparing each data point with the corresponding previous one.
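
The preprocessing step described above, differencing the raw series into a fluctuation series, can be illustrated with a minimal Python sketch. The `equal_band` tolerance and the up/equal/down labels are assumptions made here for illustration, not the paper's exact discretization scheme.

```python
import numpy as np

def fluctuation_series(series, equal_band=0.0):
    """Convert a raw time series into a fluctuation series of first
    differences, optionally discretized into up/equal/down states.

    `equal_band` is a hypothetical tolerance: differences whose absolute
    value falls within it are treated as the 'equal' state.
    """
    x = np.asarray(series, dtype=float)
    diffs = np.diff(x)                       # x[t] - x[t-1]
    states = np.where(diffs > equal_band, "up",
             np.where(diffs < -equal_band, "down", "equal"))
    return diffs, states

# Example on a short synthetic price series
diffs, states = fluctuation_series([100.0, 101.5, 101.5, 99.8, 100.2])
print(diffs)   # approximately [ 1.5  0.  -1.7  0.4]
print(states)  # ['up' 'equal' 'down' 'up']
```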


Entropy Application for Forecasting

Author: Ana Jesus Lopez-Menendez

Publisher: MDPI

Published: 2020-12-29

Total Pages: 200

ISBN-13: 3039364871

This book shows the potential of entropy and information theory in forecasting, including both theoretical developments and empirical applications. The contents cover a great diversity of topics, such as the aggregation and combination of individual forecasts, the comparison of forecasting performance, and the debate concerning the tradeoff between complexity and accuracy. Analyses of forecasting uncertainty, robustness, and inconsistency are also included, as are proposals for new forecasting approaches. The proposed methods encompass a variety of time series techniques (e.g., ARIMA, VAR, state space models) as well as econometric methods and machine learning algorithms. The empirical contents include both simulated experiments and real-world applications focusing on GDP, M4-Competition series, confidence and industrial trend surveys, and stock exchange composite indices, among others. In summary, this collection provides an engaging insight into entropy applications for forecasting, offering an interesting overview of the current situation and suggesting possibilities for further research in this field.


A Forecasting Model Based on High-Order Fluctuation Trends and Information Entropy

Author: Hongjun Guan

Publisher: Infinite Study

Published:

Total Pages: 15

ISBN-13:

Most existing high-order prediction models abstract logical rules from historical discrete states without considering historical inconsistency and fluctuation trends. In fact, these two characteristics are important for describing historical fluctuations. This paper proposes a model based on logical rules abstracted from historical dynamic fluctuation trends and the corresponding inconsistencies. In the logical-rule training stage, the dynamic trend states of up and down are mapped to the truth-membership and falsity-membership dimensions of neutrosophic sets, respectively. Meanwhile, information entropy is employed to quantify the inconsistency of a period of history, which is mapped to the indeterminacy-membership of the neutrosophic sets. In the forecasting stage, similarities among the neutrosophic sets are used to locate the most similar left-hand side of a logical relationship, so that the two characteristics, fluctuation trend and inconsistency, jointly guide the forecast. The proposed model extends existing high-order fuzzy logical relationships (FLRs) to neutrosophic logical relationships (NLRs). Compared with traditional discrete high-order FLRs, the proposed NLRs are more general and handle the problem caused by a lack of matching rules. The method is then applied to forecast the Taiwan Stock Exchange Capitalization Weighted Stock Index and the Hang Seng Index. The experimental results indicate that the model has stable prediction ability on different data sets, and comparing the prediction errors with those of other approaches shows that the model offers outstanding prediction accuracy and generality.
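
A minimal sketch of the construction described above, assuming a sliding window of fluctuation states: the shares of up and down moves serve as truth- and falsity-memberships, and the normalized Shannon entropy of the window serves as the indeterminacy-membership quantifying inconsistency. The window contents, normalization, and state labels are illustrative assumptions rather than the paper's exact formulation.

```python
import numpy as np
from collections import Counter

def window_to_neutrosophic(states):
    """Map a window of fluctuation states ('up'/'equal'/'down') to a
    single-valued neutrosophic triple (T, I, F).

    T = share of 'up' moves, F = share of 'down' moves, and I = the
    normalized Shannon entropy of the state distribution, used here as
    a proxy for the inconsistency of the window (an illustrative choice).
    """
    n = len(states)
    counts = Counter(states)
    probs = np.array([c / n for c in counts.values()])
    entropy = -np.sum(probs * np.log2(probs))
    max_entropy = np.log2(3)                 # three possible states
    T = counts.get("up", 0) / n
    F = counts.get("down", 0) / n
    I = entropy / max_entropy
    return T, I, F

# A mostly rising but somewhat inconsistent five-day window
print(window_to_neutrosophic(["up", "up", "down", "equal", "up"]))
# -> roughly (0.6, 0.87, 0.2)
```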


Forecasting Model Based on Neutrosophic Logical Relationship and Jaccard Similarity

Author: Hongjun Guan

Publisher: Infinite Study

Published:

Total Pages: 16

ISBN-13:

The daily fluctuation trend of a stock market can be in one of three statuses: up, equal, or down. These statuses can be represented by a neutrosophic set consisting of three functions: truth-membership, indeterminacy-membership, and falsity-membership. In this paper, we propose a novel forecasting model based on neutrosophic set theory and the fuzzy logical relationships between the statuses of historical and current values. Firstly, the original time series of the stock market is converted into a fluctuation time series by comparing each piece of data with that of the previous day.
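
A hedged sketch of the kind of similarity computation such a model relies on, using one commonly cited Jaccard-type similarity for single-valued neutrosophic numbers; whether this matches the paper's exact variant is an assumption.

```python
def jaccard_similarity(a, b):
    """Jaccard-type similarity between two single-valued neutrosophic
    numbers a = (Ta, Ia, Fa) and b = (Tb, Ib, Fb).

    Uses the vector form J(a, b) = a.b / (|a|^2 + |b|^2 - a.b); the
    paper may use a different variant, so treat this as illustrative.
    """
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a)
    norm_b = sum(y * y for y in b)
    return dot / (norm_a + norm_b - dot)

# Today's fluctuation status vs. a similar historical one
print(jaccard_similarity((0.7, 0.2, 0.1), (0.6, 0.3, 0.1)))  # ~0.96
```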


A Refined Approach for Forecasting Based on Neutrosophic Time Series

Author: Mohamed Abdel-Basset

Publisher: Infinite Study

Published:

Total Pages: 23

ISBN-13:

This research introduces a neutrosophic forecasting approach based on neutrosophic time series (NTS). Historical data can be transformed into neutrosophic time series data to determine their truth, indeterminacy, and falsity functions. The basis for the neutrosophication process is the score and accuracy functions of the historical data. In addition, neutrosophic logical relationship groups (NLRGs) are determined, and a deneutrosophication method for NTS is presented. The objective of this research is to introduce the idea of first- and high-order NTS. By comparing our approach with others, we conclude that the suggested forecasting approach yields better results than existing fuzzy, intuitionistic fuzzy, and neutrosophic time series approaches.
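
Score and accuracy functions for single-valued neutrosophic numbers come in several variants; the sketch below uses one common pair of definitions as an illustration of the neutrosophication basis mentioned above, not necessarily the ones used in this research.

```python
def score(t, i, f):
    """A commonly used score function for a single-valued neutrosophic
    number (T, I, F); the paper's exact definition may differ."""
    return (2.0 + t - i - f) / 3.0

def accuracy(t, i, f):
    """A commonly used accuracy function, applied to break ties in score."""
    return t - f

# Comparing two hypothetical neutrosophic evaluations of a data point
a, b = (0.8, 0.1, 0.2), (0.6, 0.2, 0.1)
print(score(*a), score(*b))        # ~0.833 vs ~0.767, so a ranks higher
print(accuracy(*a), accuracy(*b))  # 0.6 vs 0.5
```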


Computational Intelligence in Data Mining

Author: Himansu Sekhar Behera

Publisher: Springer

Published: 2019-08-17

Total Pages: 789

ISBN-13: 9811386765

These proceedings discuss the latest solutions, scientific findings, and methods for solving intriguing problems in the fields of data mining, computational intelligence, big data analytics, and soft computing. The volume gathers outstanding papers from the fifth International Conference on "Computational Intelligence in Data Mining" (ICCIDM) and offers a "sneak preview" of the strengths and weaknesses of trending applications, together with exciting advances in computational intelligence, data mining, and related fields.


Algorithm Selection for Edge Detection in Satellite Images by Neutrosophic WASPAS Method

Author: Romualdas Bausys

Publisher: Infinite Study

Published:

Total Pages: 23

ISBN-13:

Nowadays, integrated land management is generally governed by the principles of sustainability, and land use management is usually grounded in satellite image information. The detection and monitoring of areas of interest in satellite images is a difficult task. We propose a new methodology for the adaptive selection of edge detection algorithms using visual features of satellite images and a multi-criteria decision-making (MCDM) method. Selecting the most appropriate method for a given satellite image is not trivial: no single algorithm works in all cases, because suitability depends on many factors, such as the acquisition and content of the raster images, the visual features of real-world images, and human visual perception. The edge detection algorithms were ranked according to their suitability for the given satellite images using the neutrosophic weighted aggregated sum product assessment (WASPAS) method. The results obtained with the proposed methodology were verified against results acquired in an alternative way, by applying the edge detection algorithms directly to specific images. This methodology facilitates the selection of a proper edge detector for the chosen image content.
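
For orientation, the classical (crisp) WASPAS aggregation that underlies the neutrosophic variant can be sketched as follows; the neutrosophic WASPAS used in the paper operates on single-valued neutrosophic numbers rather than crisp scores, so this is only an illustration of the weighted-sum/weighted-product idea, with hypothetical detector scores.

```python
import numpy as np

def waspas_rank(decision_matrix, weights, lam=0.5):
    """Rank alternatives with the classical (crisp) WASPAS aggregation:
    Q_i = lam * WSM_i + (1 - lam) * WPM_i.

    All criteria are assumed to be benefit criteria (larger is better);
    this crisp sketch only illustrates the weighted-sum / weighted-product
    combination behind the neutrosophic extension.
    """
    x = np.asarray(decision_matrix, dtype=float)
    w = np.asarray(weights, dtype=float)
    norm = x / x.max(axis=0)                 # linear normalization per criterion
    wsm = (norm * w).sum(axis=1)             # weighted sum model
    wpm = np.prod(norm ** w, axis=1)         # weighted product model
    q = lam * wsm + (1 - lam) * wpm
    return np.argsort(-q), q                 # best alternative first

# Three hypothetical edge detectors scored on two image-feature criteria
order, scores = waspas_rank([[0.9, 0.6], [0.7, 0.8], [0.5, 0.9]],
                            weights=[0.6, 0.4])
print(order, scores)
```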