LEE - Computational Science and Engineering - Master's
Browsing LEE - Computational Science and Engineering - Master's by subject "hyperspectral imaging"
Item: Augmented superpixel based anomaly detection in hyperspectral imagery (Graduate School, 2024-07-01). Gökdemir, Ezgi; Tuna, Süha; 702211005; Computational Science and Engineering

The detection of anomalies in hyperspectral images depends on several factors: the spatial proximity of anomalies and confusion in the background can create a bottleneck for anomaly detection. Hyperspectral images are tensor data in which each pixel contains both spatial and spectral information. These complex data structures pose significant challenges for traditional anomaly detection methods, which often struggle to account for the intricate relationships between the different spectral bands. In this thesis, a method called "Augmented Superpixel (Hyperpixel) Based Anomaly Detection in Hyperspectral Imagery" is proposed. It aims to enhance anomaly detection by leveraging advanced dimensionality reduction and segmentation techniques. Our approach begins by reducing the three-dimensional HSI data using methods such as High Dimensional Model Representation and Principal Component Analysis. This step simplifies the data while preserving critical spectral and spatial information. By capturing the most significant components of the data, these techniques help eliminate noise and irrelevant details, making the subsequent analysis more focused and effective. We then apply segmentation methods such as Simple Linear Iterative Clustering and Linear Spectral Clustering to divide the image into distinct regions known as superpixels. Each superpixel is augmented with its first-order neighbors to form hyperpixels, which provide a richer context for anomaly detection. The augmentation process ensures that the local context is considered, enhancing the ability to detect subtle anomalies that may be missed when individual superpixels are examined in isolation.
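As a sketch of the augmentation step described above (not the thesis's actual pipeline, which obtains the label map from SLIC or LSC), the following assumes a superpixel label map is already given and grows each superpixel into a hyperpixel by merging it with its first-order, edge-adjacent neighbors:

```python
import numpy as np

def first_order_neighbors(labels):
    """Map each superpixel label to the set of edge-adjacent labels."""
    nbrs = {lab: set() for lab in np.unique(labels)}
    # Compare horizontally and vertically adjacent pixel pairs.
    for a, b in [(labels[:, :-1], labels[:, 1:]),
                 (labels[:-1, :], labels[1:, :])]:
        diff = a != b
        for u, v in zip(a[diff], b[diff]):
            nbrs[u].add(v)
            nbrs[v].add(u)
    return nbrs

def hyperpixel_mask(labels, lab, nbrs):
    """Boolean mask of a hyperpixel: the superpixel plus its neighbors."""
    members = {lab} | nbrs[lab]
    return np.isin(labels, list(members))

# Toy 4x4 label map with four superpixels arranged in a 2x2 grid.
labels = np.array([[0, 0, 1, 1],
                   [0, 0, 1, 1],
                   [2, 2, 3, 3],
                   [2, 2, 3, 3]])
nbrs = first_order_neighbors(labels)
mask = hyperpixel_mask(labels, 0, nbrs)  # covers superpixels 0, 1, and 2
```

In the full method, the spectra of all pixels under such a mask would form one hyperpixel's feature set for the subsequent outlier analysis.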
This neighborhood information is crucial for accurately identifying the boundaries of anomalies and distinguishing them from normal variations in the data. Finally, we apply the Local Outlier Factor algorithm to these hyperpixels to identify the outlier points that signify anomalies. The capability of the Local Outlier Factor to evaluate local density deviations enables it to identify anomalies accurately, even against densely populated or intricate backgrounds. The combination of these techniques ensures a comprehensive and precise analysis that can handle the diverse characteristics of hyperspectral datasets. The proposed algorithm was tested on various hyperspectral image datasets and demonstrated good performance in detecting anomalies. By integrating dimensionality reduction, segmentation, and anomaly detection techniques, the method effectively manages the complexity of hyperspectral data. This comprehensive approach allows accurate identification of anomalies even in challenging conditions where anomalies are closely packed or the background is complex. Through rigorous experimentation, the algorithm demonstrated robustness and reliability, making it a promising tool for hyperspectral image analysis. Its versatility and high accuracy across different datasets underline its potential for broad application in fields such as remote sensing, environmental monitoring, and urban planning. The ability to adapt to various anomaly characteristics and dataset structures makes this method a valuable addition to the toolkit of hyperspectral image analysis techniques.
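The Local Outlier Factor idea can be illustrated with a minimal NumPy implementation; this is an illustrative sketch, not the thesis's code, and in the actual method the points would be hyperpixel feature vectors rather than raw 2-D coordinates:

```python
import numpy as np

def lof_scores(X, k=3):
    """Local Outlier Factor: ratio of the neighbors' local density to a
    point's own local density (scores well above 1 indicate anomalies)."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)               # exclude self-distances
    knn = np.argsort(d, axis=1)[:, :k]        # indices of k nearest neighbors
    kdist = np.sort(d, axis=1)[:, k - 1]      # k-distance of each point
    # Reachability distance: max(k-distance(neighbor), actual distance).
    reach = np.maximum(kdist[knn], d[np.arange(len(X))[:, None], knn])
    lrd = 1.0 / reach.mean(axis=1)            # local reachability density
    return lrd[knn].mean(axis=1) / lrd        # avg. neighbor density / own

# Dense grid of "background" points plus one isolated point at index 25.
grid = np.array([[i, j] for i in range(5) for j in range(5)], dtype=float)
X = np.vstack([grid, [[10.0, 10.0]]])
scores = lof_scores(X, k=3)
```

Because the score is a ratio of local densities, points inside the dense grid score near 1 while the isolated point scores far higher, which is why the method remains reliable against crowded backgrounds.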
Item: Exploiting optimal supports in enhanced multivariance products representation for lossy compression of hyperspectral images (Graduate School, 2024-01-31). Şen, Muhammed Enis; Tuna, Süha; 702211008; Computational Science and Engineering

Data serves as an irreplaceable foundation of modern society, as it is the core element of numerous fields such as technological innovation, scientific advancement, and economic decision-making. It enables insights into domains of knowledge and experience, assists with decision-making tasks, and supports predictions of future outcomes. Data has evolved from knowledge kept in physical formats, such as carvings on cave walls and events preserved as inscriptions, into mathematical structures produced by any interaction with current technological devices, from activity on social media to observations acquired with advanced instruments. As data transforms into ever more detailed and complex structures, it poses efficiency challenges that demand computational methods capable of processing and handling such structures. Many methods have been proposed for these data handling needs and remain in use. Each has its advantages as well as drawbacks, in the form of either certain limitations or computational complexity issues. Alternative workarounds have been suggested for such issues, for example embracing iterative approaches rather than direct solutions, and techniques have been customized to fit specific workflows. Moreover, some innovative approaches operate on representations of the data that have undergone compression and transformation, rendering them into more easily processable structures. An important aspect of these practices is the preservation of the data's characteristic features.
Compression methods execute this procedure in distinct ways, for example by exploiting eigenvalues and eigenvectors or by utilizing singular values. These techniques not only streamline the processing of data but also contribute to the efficiency and accuracy of analyses by retaining characteristic features throughout the compression process. In the field of data processing, an understanding of these diverse methodologies proves convenient when selecting the most effective solution for the application under consideration. Hyperspectral imaging is an area that requires such computational techniques to process the collected data, owing to its high-dimensional workflow. It produces 3-dimensional mathematical structures in which the first two dimensions correspond to the spatial attributes of the captured area, while the third dimension captures the spectral information according to the acquiring device's capacity for retrieving bands. As a result, the fibers along the data's third dimension correspond to spectral signatures that enable the identification of objects and materials. The ability to analyze these spectral data opens doors to useful applications in numerous areas such as remote sensing, agriculture, medical imaging, archaeology, and urban planning. Recent studies in the computational sciences for high-dimensional structures have adopted new methods that improve overall processing performance and make more in-depth analyses possible. Considering the relational design along its third dimension, High Dimensional Model Representation (HDMR) is a technique from which hyperspectral imaging can benefit deeply, thanks to its decorrelation properties. The aim of HDMR is to represent multivariate functions in terms of lower-dimensional ones. Owing to the way it is defined, the technique is also applicable to tensors; hence, it can be used to decompose a given tensor into lower-dimensional entities, each of which describes the behavior of a certain combination of dimensions.
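The idea of representing a tensor through lower-dimensional terms can be sketched as follows; this is a minimal first-order HDMR truncation of a 3-D tensor under constant (uniform) weights, an assumption made here purely for illustration:

```python
import numpy as np

def hdmr_first_order(t):
    """Truncated HDMR of a 3-D tensor with uniform weights: a constant
    (zeroth-order) term plus three univariate terms, one per axis."""
    t0 = t.mean()                      # zeroth-order term
    t1 = t.mean(axis=(1, 2)) - t0      # univariate term for axis 0
    t2 = t.mean(axis=(0, 2)) - t0      # univariate term for axis 1
    t3 = t.mean(axis=(0, 1)) - t0      # univariate term for axis 2
    approx = (t0 + t1[:, None, None]
                 + t2[None, :, None]
                 + t3[None, None, :])
    return t0, (t1, t2, t3), approx

# An additive tensor t[i,j,k] = a[i] + b[j] + c[k] is represented
# exactly by the first-order truncation.
a, b, c = np.arange(4.0), np.arange(5.0), np.arange(6.0)
t = a[:, None, None] + b[None, :, None] + c[None, None, :]
t0, terms, approx = hdmr_first_order(t)
err = np.abs(t - approx).max()
```

The univariate terms here capture exactly the per-axis structure, which is the decorrelation property the abstract refers to: each axis's contribution is isolated into its own 1-dimensional component.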
This ability of HDMR addresses the decorrelation of each dimension of the given data. The decorrelation procedure enables reducing noise and removing artifacts while preserving the high-frequency components. Hence, HDMR is a suitable compression technique for high-dimensional data with strong relations along individual axes, such as hyperspectral images. HDMR employs a set of weights and support vectors to represent data, which consequently necessitates calculation steps. These entities are either assigned certain values or arranged using techniques like Averaged Directional Supports (ADS), but the process of calculating the optimal entities can also be carried out with iterative methods such as the Alternating Direction Method of Multipliers (ADMM), where the requirements of HDMR serve as the constraints of ADMM. A sub-method of HDMR called Enhanced Multivariance Products Representation (EMPR) specializes in optimizing the representation by focusing on the support vectors: the weights are assumed to be constant values or scalars, and the support vectors are managed by the previously mentioned calculation techniques. Because these techniques employ the data itself when calculating the support vectors, EMPR is a more robust method than HDMR. Iterative approaches like ADMM can further shape the properties of these support vectors, for example by enforcing sparsity for better representations and improving denoising capabilities. This thesis explores the hyperspectral imaging area and proposes a new perspective on decomposition methods by bringing a tensor-based iterative approach to EMPR through the use of ADMM. The study compares the proposed method's performance and efficiency with other well-known tensor decomposition techniques, namely CANDECOMP/PARAFAC Alternating Least Squares (CP-ALS) and Tucker Decomposition (TD), while also comparing the results to EMPR's standard application via ADS.
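A hedged sketch of the support-vector idea discussed above: here the supports are taken as normalized directional averages of the data (an ADS-like choice assumed for illustration, with unit weights), and only the zeroth-order scalar term is retained; this is not the thesis's exact ADS or ADMM formulation:

```python
import numpy as np

def empr_rank_one(t):
    """Zeroth-order EMPR-style approximation of a 3-D tensor: the three
    support vectors are normalized directional averages, and a single
    scalar core is obtained by contracting the tensor with them."""
    s1 = t.mean(axis=(1, 2)); s1 /= np.linalg.norm(s1)
    s2 = t.mean(axis=(0, 2)); s2 /= np.linalg.norm(s2)
    s3 = t.mean(axis=(0, 1)); s3 /= np.linalg.norm(s3)
    core = np.einsum('ijk,i,j,k->', t, s1, s2, s3)      # zeroth-order term
    approx = core * np.einsum('i,j,k->ijk', s1, s2, s3)  # rank-1 reconstruction
    return core, (s1, s2, s3), approx

# A rank-1 tensor with positive factors is recovered exactly, since its
# directional averages are proportional to the factors themselves.
a, b, c = np.array([1.0, 2.0]), np.array([1.0, 3.0]), np.array([2.0, 5.0])
t = np.einsum('i,j,k->ijk', a, b, c)
core, supports, approx = empr_rank_one(t)
rel_err = np.linalg.norm(t - approx) / np.linalg.norm(t)
```

Because the supports are computed from the data itself, they adapt to its structure along each axis, which is the robustness advantage over fixed, data-independent choices.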
Multiple tests are performed on hyperspectral datasets, which are 3-dimensional; as a result, the proposed technique is arranged to be applicable to any 3-dimensional tensor, especially data that can benefit from the decorrelation properties of EMPR. Through EMPR, the relations within each dimension, and within combinations of these dimensions, are acquired through the support vectors. Results from multiple metrics show that the proposed method performs similarly to the mentioned tensor decomposition methods at specified ranks, and that the decorrelated dimensions are successfully represented by the 1-dimensional EMPR components. The tests also employ the 2-dimensional components to reveal their effect on the final representations, with comparisons to CP-ALS and TD across multiple rank options. The key strength of the proposed technique lies in EMPR's superior decorrelation ability. Not only does it demonstrate the capability of reconstructing high-dimensional data with similar accuracy, it also highlights the potential to reduce noise and artifacts in the process. These results are particularly promising for any lossy compression task involving Cartesian geometry that utilizes tensor decomposition techniques, where accurate and efficient data processing is paramount. Furthermore, this performance advantage paves the way for advancements in lossy compression techniques, enabling researchers and practitioners to gain more precise insights from data.
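The abstract does not name its metrics; as one common fidelity measure for comparing lossy compression results, a peak signal-to-noise ratio computation might look like this (an assumption for illustration, not the thesis's exact metric set):

```python
import numpy as np

def psnr(original, reconstructed, data_range=None):
    """Peak signal-to-noise ratio in dB; higher values indicate a more
    faithful lossy reconstruction of the original data."""
    if data_range is None:
        data_range = original.max() - original.min()
    mse = np.mean((original - reconstructed) ** 2)
    return 10.0 * np.log10(data_range ** 2 / mse)

# A reconstruction uniformly off by 0.1 on unit-range data:
# MSE = 0.01, so PSNR = 10 * log10(1 / 0.01) = 20 dB.
x = np.linspace(0.0, 1.0, 100)
x_hat = x + 0.1
value = psnr(x, x_hat, data_range=1.0)
```

Scoring each candidate decomposition (EMPR via ADMM, EMPR via ADS, CP-ALS, TD) with such a metric at matched ranks is one straightforward way to make the comparison the study describes.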