LEE - Information Security Engineering and Cryptography - Master's
-
Item: A new public key algorithm and complexity analysis (Graduate School, 2023-06-23) Çağlar, Selin ; Özdemir, Enver ; 707201029 ; Cybersecurity Engineering and Cryptography

With the development of technology, many processes have begun to be digitized. As a result of this digitalization, digital communication has become inevitable in our lives. Digital communication is faster and easier to access than traditional communication methods; especially with the Covid-19 pandemic, the contribution of digitalized processes to our daily lives has been felt visibly. As a result of digitization, a great deal of data belonging to different data classes has been transferred to the digital environment, which has changed the methods of storing and using data. At this point, issues such as data privacy and security have grown in importance, and the concept of secure digital communication has come to the fore. Secure digital communication concerns providing the cornerstones of security, such as confidentiality, integrity, and authentication, while transferring data over digital channels. Confidentiality is the process of preventing unauthorized parties from viewing sensitive data and ensuring that only those who have been given permission can do so; this can be achieved through data encryption, access controls, and secure channels. Integrity refers to the assurance that data remains unaltered and uncorrupted during transmission, storage, and processing, ensuring that the data can be trusted and relied upon; techniques such as digital signatures and hash functions can be used to verify the integrity of data. Verifying a user's or a device's identity when they want to access data or services is referred to as authentication. This is typically achieved through digital signatures, cryptographic techniques that verify the authenticity of data by verifying the identity of the sender.
Together, these three principles form the foundation of secure communication. When sharing data in a public environment, the data to be transferred must be protected; in other words, the principle of confidentiality, which is the main starting point of this study, must be ensured. Cryptography, which enables encryption structures, is used to ensure confidentiality. Symmetric key cryptography, which is more efficient in terms of key length and cryptographic operations and uses the same key for encryption and decryption, is widely used in encryption processes. In symmetric key cryptography, the parties that encrypt and decrypt the data must use the same cryptographic key, and this key must be shared securely between them. Asymmetric key cryptography is used to share the symmetric key, especially in processes established in a public environment where the parties have no opportunity to exchange keys physically. Asymmetric key cryptography is based on a key pair consisting of a public and a private key. The public key can be shared openly with the parties that will send encrypted data. The private key, on the other hand, is used to decrypt the received ciphertext and must be kept securely by the owner of the key pair. Asymmetric key cryptography provides both confidentiality and authentication; the fact that it also provides authentication increases the security of key exchange processes. After the parties verify each other cryptographically during key exchange, asymmetric key cryptography provides an environment for sharing the symmetric key that will secure the communication. The RSA algorithm is one of the oldest and most widely used asymmetric key algorithms. Its security is based on the difficulty of factoring integers.
In the RSA algorithm, the public key modulus is the product of two large prime numbers of the same size, and revealing these two primes is enough to break the algorithm. At the same time, it may be possible to recover the message from the ciphertext without factoring; this is called the RSA problem. Research has suggested that there may be an easier way to recover a message from ciphertext without factoring, and if an effective method is developed for the RSA problem, the security of many RSA-based systems will be under threat. In this thesis, a new public key algorithm is proposed as an alternative to RSA in the event that the RSA problem is solved. The algorithm is based on nodal curves, and its group structure differs from that of RSA. In the proposed algorithm, the discrete logarithm problem is thought to be harder, since the group structure in which the algorithm works is based on polynomial arithmetic and is also inspired by elliptic and hyperelliptic curves. For this reason, the proposed algorithm is expected to be more resistant to the problem underlying RSA. In addition, a new group operation (addition) algorithm is presented, obtained by modifying the Mumford representation and the Cantor algorithm, in order to perform the group operation on nodal curves. A performance comparison between the presented group operation on nodal curves and the Cantor algorithm has been made, and the new group operation was found to be more efficient. The proposed algorithm also behaves probabilistically: even if the data to be encrypted does not change, the resulting ciphertext can differ. The RSA algorithm, in contrast, is deterministic; additional padding is needed to produce different ciphertexts from the same data.
Since the proposed public key algorithm is based on polynomial arithmetic, it has no performance advantage over the RSA algorithm; there is a trade-off between security and performance. To show the practical applicability of the presented solution, a performance comparison with the RSA algorithm has also been made. The performance problem is caused by the exponential growth of the secret key as the degree of the nodal curve increases; in other words, the proposed algorithm is slower than RSA in the decryption phase. However, since decryption in asymmetric key cryptography is generally not performed by individual users, powerful servers are expected to be unaffected by this performance problem. During the tests, the SageMath library and the Python programming language were used.
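The thesis does not include its implementation, but the deterministic behavior of textbook RSA discussed in this abstract can be illustrated with a small Python sketch (toy parameters only; real deployments use much larger primes and randomized padding such as OAEP):

```python
# Toy textbook RSA with tiny primes (illustration only, not secure).
p, q = 61, 53
n = p * q                  # public modulus: 3233
phi = (p - 1) * (q - 1)    # Euler's totient: 3120
e = 17                     # public exponent
d = pow(e, -1, phi)        # private exponent via modular inverse (Python 3.8+)

m = 65                     # plaintext encoded as an integer < n
c = pow(m, e, n)           # encrypt

assert pow(c, d, n) == m   # decryption recovers the plaintext
assert pow(m, e, n) == c   # deterministic: same plaintext, same ciphertext
```

Because textbook RSA encryption is a fixed function of the plaintext, an eavesdropper can detect repeated messages; a probabilistic scheme such as the one proposed in the thesis avoids this by design.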
-
Item: Analyzing individual data for insider threat detection (Graduate School, 2024-07-16) Yakar, Burak ; Özdemir, Enver ; 707211003 ; Cybersecurity Engineering and Cryptography

Insider threats are recognized as one of the most significant risks in cybersecurity. Research indicates that a majority of security breaches are caused by attacks or vulnerabilities originating from within the organization. Even in the most secure systems, as long as humans are part of the system, absolute security cannot be guaranteed. Technology is everywhere in our lives: people use smartphones, smartwatches, computers, and various other smart devices, all of which collect data to some extent. This data collection occurs not only at a personal level but also across businesses of all sizes. Businesses invest heavily in their operations and therefore need to secure their assets, so they also invest in security measures. Some of these investments are physical precautions against physical risks; others relate to cybersecurity and mitigate cyber risks. Even if a business builds the best IDS (Intrusion Detection System) or IPS (Intrusion Prevention System), attackers may still find ways to infiltrate, because humans are the weakest component of any ICT (Information and Communications Technology) security system and present the greatest risks and threats to a company, organization, or system. Insider threats are cybersecurity threats that originate from authorized users such as employees, business partners, contractors, vendors, and former employees. Misusing legitimate user credentials and account hijacking are among the methods used to carry out these intentions. Not all of these actions are intentional; some may be unintentional. Either way, they compromise the confidentiality, integrity, and availability of systems and data.
These actions can cause significant expenses that most SMEs (small and medium-sized enterprises) cannot afford. This study focuses on defining insider threats, mitigating the security risks that lead to insider vulnerabilities, and preventing insider threats by analyzing individual data using the random forest algorithm. The aim of this study is to find a method to detect malicious intentions and prevent potential attacks before they occur.
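The thesis does not publish its data pipeline, but "analyzing individual data" for a classifier like a random forest typically starts with per-user feature extraction from raw event logs. A minimal sketch, in which the event and feature names ("logon", "usb_insert", "file_copy") are invented for illustration:

```python
from collections import defaultdict
from datetime import datetime

def extract_user_features(events):
    """Aggregate raw log events into per-user behavioral feature counters.

    events: iterable of (user, timestamp, action) tuples; the action names
    used here are hypothetical, not taken from the thesis.
    """
    features = defaultdict(lambda: {"logons": 0, "off_hours_logons": 0,
                                    "usb_inserts": 0, "file_copies": 0})
    for user, ts, action in events:
        f = features[user]
        if action == "logon":
            f["logons"] += 1
            if ts.hour < 8 or ts.hour >= 18:   # outside nominal work hours
                f["off_hours_logons"] += 1
        elif action == "usb_insert":
            f["usb_inserts"] += 1
        elif action == "file_copy":
            f["file_copies"] += 1
    return dict(features)

events = [
    ("alice", datetime(2024, 5, 6, 9, 12), "logon"),
    ("alice", datetime(2024, 5, 6, 22, 41), "logon"),
    ("alice", datetime(2024, 5, 6, 22, 50), "usb_insert"),
    ("bob",   datetime(2024, 5, 6, 8, 55), "logon"),
]
print(extract_user_features(events)["alice"])
```

Feature vectors of this shape, one row per user, are what a random forest would then be trained on.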
-
Item: Design and analysis of privacy-preserving and regulations-compliant central bank digital currency (Graduate School, 2024-07-12) Doğan, Ali ; Bıçakcı, Kemal ; 707211012 ; Cybersecurity Engineering and Cryptography

Significant advances have been made in the field of Central Bank Digital Currency (CBDC) in the last five years, not only in the academic world but also in central banks. Currently, more than 130 countries continue their CBDC studies at the research, pilot, and proof-of-concept levels. The increased interest in CBDC can be attributed to factors such as progress in digital payment technologies, the widespread use of cryptocurrencies in the digital money market, and the advantages this technology brings. Alongside these advantages, there are challenges and problems that must still be resolved for CBDCs to reach maturity. One of these problems is the conflict between efforts to protect the privacy of digital currency users and the compliance mechanisms introduced by states to ensure financial stability and social order. States try to prevent and monitor financial crimes through regulations such as anti-money-laundering and countering the financing of terrorism. However, in the transition to digital money, such regulations could lead to citizens' lives being completely monitored. In addition to this conflict, a significant portion of existing CBDCs operate on blockchain-based systems. Due to the transparent structure of the blockchain, parties in the network can track and monitor users' transactions, so transaction privacy is lost. In the present study, solutions to these privacy problems are introduced using cryptographic techniques such as zero-knowledge proofs, threshold cryptography, and homomorphic encryption. In the proposed system, the user's balance is kept homomorphically encrypted on the blockchain.
To perform a transfer, the sender encrypts the amount to be transferred with their own public key, the receiver's public key, and the regulators' public key, and then creates a zero-knowledge proof that the amount is the same in all three ciphertexts. Since the transaction is processed over ciphertexts, the sender must also create a range proof that their balance is sufficient. After all the proofs are created and transmitted to the blockchain, the nodes confirm the transaction; the sender's balance is then homomorphically decreased and the recipient's balance increased via the ciphertexts. In any suspicious case, the user's transaction history can be traced by government institutions called regulators. However, threshold encryption is used to ensure that this control is not left to the initiative of a single institution: the institutions must reach a consensus, and only after the threshold is reached can they see the transaction details. Additionally, techniques are suggested so that commercial banks can continue their services in this system.
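The abstract does not name the homomorphic scheme used. As a hedged illustration of what "keeping a balance homomorphically encrypted" means, here is a toy Paillier cryptosystem, which is additively homomorphic: multiplying two ciphertexts yields an encryption of the sum of the plaintexts (tiny parameters, illustration only; this is a standard scheme, not the thesis's construction):

```python
import math
import random

def L(x, n):
    return (x - 1) // n

def keygen(p, q):
    # Toy Paillier key generation; p and q must be distinct primes.
    n = p * q
    lam = math.lcm(p - 1, q - 1)          # Python 3.9+
    g = n + 1
    mu = pow(L(pow(g, lam, n * n), n), -1, n)
    return (n, g), (lam, mu)

def encrypt(pk, m):
    n, g = pk
    r = random.randrange(1, n)            # fresh randomness each call
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return pow(g, m, n * n) * pow(r, n, n * n) % (n * n)

def decrypt(pk, sk, c):
    n, _ = pk
    lam, mu = sk
    return L(pow(c, lam, n * n), n) * mu % n

pk, sk = keygen(17, 19)                   # toy primes; real keys are 2048+ bits
balance = encrypt(pk, 100)                # encrypted account balance
deposit = encrypt(pk, 25)                 # encrypted incoming transfer
balance = balance * deposit % (pk[0] ** 2)  # homomorphic addition, no decryption
assert decrypt(pk, sk, balance) == 125
```

Note that encryption is probabilistic (a fresh r each time), so equal amounts produce different ciphertexts, which is exactly the property a privacy-preserving ledger needs.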
-
Item: Detecting malicious activity inside of the network (Graduate School, 2023-12-20) Kumbasar, Ayşenur ; Özdemir, Enver ; 707201002 ; Cybersecurity Engineering and Cryptography

In today's world of global development and digitalization, applications and services used in the banking and finance sectors, as in all sectors, have quickly adapted to the online world. The increasing rate of transition to the Internet shows that security is becoming ever more important and serious for banks and their customers. Companies in the financial and banking sectors are an attractive target for cyber attackers, both for the damage that can be done to the target system and for the data attackers can obtain. Protecting information systems that contain important and sensitive business and customer information, such as databases, servers, computers, and networks, is of high importance. Likewise, providing a secure and robust online communication environment in customer-facing services, and ensuring that data is transmitted over reliable channels, is one of the most important elements in the banking sector. Banks are therefore making major investments in security systems to ensure secure communication and to protect personal and business information and documents against the increasing number of cyber attacks. With these systems, they can potentially prevent such attacks by detecting and responding to abnormal and unauthorized activities. However, research shows that the majority of cyber attacks are carried out by insiders, while most security products in use focus on external threats. If the attacker is a person working within the organization, these systems may be insufficient to detect such activities. An inside attacker has legitimate access privileges to sensitive data, systems, and networks that outsiders do not have.
Insider attacks are difficult to predict and prevent because the malicious user follows legitimate paths and methods. Since insiders have detailed knowledge of the internal organization, such as the corporate network, they can misuse sensitive and confidential data and cause irreversible damage and great losses to organizations. Therefore, the cost of damage caused by an internal threat can be said to be much higher than that of an external threat. This study focuses on detecting insider threats by monitoring users with a behavioural focus. By examining normal and malicious user behaviour with the SVM, KNN, and Random Forest algorithms, it aims to detect internal threats and, together with the preventive controls that follow, help minimize the damage that can be done to the institution.
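KNN is one of the three classifiers the thesis names. Since the thesis's features and dataset are not given here, the following is a minimal from-scratch sketch with made-up two-dimensional behavioral features (off-hours logons per week, files copied per week):

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest neighbors.

    train: list of (feature_vector, label) pairs.
    """
    neighbors = sorted(train, key=lambda p: math.dist(p[0], query))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Hypothetical labeled behavior: (off-hours logons, files copied) per week.
train = [
    ((0, 3), "normal"), ((1, 5), "normal"), ((0, 2), "normal"),
    ((7, 40), "malicious"), ((9, 55), "malicious"), ((8, 38), "malicious"),
]
print(knn_predict(train, (8, 45)))   # the three nearest neighbors are malicious
```

The same feature vectors could equally be fed to an SVM or a random forest; KNN is shown only because it fits in a few lines.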
-
Item: Distributed anomaly-based intrusion detection system for IoT environment using Blockchain technology (Graduate School, 2022-02-04) Hejazi, Nouha ; Özdemir, Enver ; 707191006 ; Cybersecurity Engineering and Cryptography

The IoT world is growing rapidly. One of the most important challenges facing the commercialization of IoT-related innovations is preserving system security and the privacy of users' information while achieving high acceptance levels. Unfortunately, IoT inherits security threats from its enabling technologies, and the special characteristics of IoT systems place many constraints on any applicable security solution, making it more challenging to preserve system security. This enlarges the threat landscape and makes the system vulnerable to inside as well as outside attacks. Moreover, IoT networks are usually deployed at a vast scale, so they produce a huge amount of data during communication. This makes machine learning a promising approach to securing IoT systems: the data can be analyzed and used to detect abnormal behavior or anomalies. Nevertheless, given the resource and power constraints under which IoT devices operate, it is vital to reduce the storage and processing power needed for the detection algorithm, or to propose an architecture that distributes the load over network nodes. Instead of implementing the Intrusion Detection System centrally and handling data from the whole IoT system, which exposes the system to attacks and creates a single point of failure if the central server is compromised, a distributed collaborative architecture can take advantage of the massive deployment of IoT devices. Collaborative intrusion detection systems have better knowledge of their protected environments and provide a solution for applications that are sensitive to user privacy.
In this work, we introduce a new security solution for intrusion detection in IoT systems. Our proposed solution utilizes a distributed collaborative architecture that tries to take advantage of the IoT structure and overcome its limitations. A federated learning method is proposed in this thesis: each node trains a local model on its private dataset, and the parameters of the local model are then shared with other nodes to generate a better global model. The thesis proposes using a Generative Adversarial Network (GAN) to detect anomalies. The model is trained on normal system behavior; the generator mimics attacks, while the discriminator detects anomalies based on their difference from normal behavior. This technique can address the problem of having only limited data points that represent malicious behavior. Additionally, the thesis suggests employing an autoencoder for feature extraction, for four main purposes: first, to improve the efficiency of the GAN training process by lowering system congestion; second, to minimize the required sample size; third, to make the training and classification process lighter and easier; and finally, to conceal the data in scenarios where a device shares its data along with its model's parameters to gain trustworthiness. Our solution also employs data sharing and mutual trust between system devices using blockchain technology. The collaborating devices share their models' parameters over the blockchain; in this way, they can compute the global model by averaging all shared models, or check their results against their neighbors' models. Furthermore, in a distributed peer-to-peer IDS network, alert exchange between the different IDS nodes is vital to detect anomalies and determine the trustworthiness of the network's nodes.
Additionally, system devices might share an encoded version of their data over the blockchain along with their models' parameters, enabling other devices to verify the detected intrusion. To determine the trustworthiness of a node, a calculation can be initiated based on how well the received alert-related information is borne out. The blockchain registry would then include the alerts generated by each IDS node, and the collaborating nodes would rely on the consensus protocol to judge the validity of alerts before inserting them onto the blockchain. However, since each IoT system may have a different structure and characteristics depending on its functionality and the circumstances in which it is deployed, different IoT systems might apply our suggested solution with different settings. Also, given the time and equipment limitations of our research, we present a general structure for the proposed system and discuss it in terms of the security aspects that govern collaborative distributed IDSs.
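The "averaging all shared models" step described above is the core of FedAvg-style federated learning. A minimal sketch, assuming each node flattens its model into one parameter list before sharing it over the blockchain:

```python
def federated_average(models):
    """Average model parameters shared by collaborating nodes (FedAvg-style).

    models: one flat parameter list per node; all lists have equal length.
    """
    n = len(models)
    return [sum(weights) / n for weights in zip(*models)]

# Three nodes share the parameters of their locally trained models.
local_models = [
    [0.10, 0.50, -0.30],
    [0.20, 0.40, -0.10],
    [0.00, 0.60, -0.20],
]
global_model = federated_average(local_models)
print(global_model)   # approximately [0.1, 0.5, -0.2]
```

In the full scheme each node would also weight contributions by dataset size and verify the shared parameters via the blockchain consensus before averaging; both refinements are omitted here.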
-
Item: Generating synthetic data for user behavior based intrusion detection systems (Graduate School, 2024-07-16) İbrahimov, Ughur ; Özdemir, Enver ; 707211009 ; Cybersecurity Engineering and Cryptography

Intrusion detection systems occupy a critical point in the effort to mitigate cyber vulnerabilities. While malicious actors increase day by day, the demand for multifunctional IDS models constantly grows. Since data plays the most crucial role in all cybersecurity measures, obtaining data is vital when developing these security precautions. At this point, synthetic data makes a unique contribution to overcoming the problem of data scarcity. This thesis examines the concept of intrusion detection, the necessity of synthetic data in cybersecurity, and synthetic data generation methods. The analysis covers the relationship between synthetic data and intrusion detection systems, the process of applying synthetic data, and privacy considerations when generating and deploying artificial data for cybersecurity measures. After this detailed analysis, we choose a generation method and tool for the purpose of this thesis. Since there are various methods and techniques for producing synthetic data for different purposes, the right modeling and method must be chosen for the work at hand. Synthetic data generation methods include machine learning approaches such as generative adversarial networks (GAN) and variational autoencoders (VAE), as well as simulation, interpolation and extrapolation, statistical modelling, and others. In this thesis, we generate synthetic data reflecting the daily behavior of a user who works as an information technology support technician and deals with tickets. Python libraries are used on the technical side to produce the data, and a scenario was developed to make the synthetic dataset as close to real-life incidents as possible.
Constants such as ticket identifiers, ticket types, and action types are clearly defined in order to generate balanced synthetic data; one requirement for synthetic data used across different industries is that it be constructed in a balanced shape. Ticket types are defined as task, bug, support, question, and feature; the defined actions include working on a ticket, reassigning a ticket, attaching a file to a ticket, and others. Although approximately 35,000 movements were created over a two-week period, the duration could be extended in later work for a more realistic distribution. The synthetic data only shows actions between 9 a.m. and 5 p.m., the work hours; the time spent is calculated as the difference between randomly assigned start and finish times within these hours. The generated data is stored in an Excel file of approximately 35,000 lines, and the amount can be changed in the code according to the purpose. The statistical distribution of the result is shown in histograms at the end.
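The generation scheme described above (fixed ticket/action vocabularies, a two-week window, events confined to 9-to-5) can be sketched as follows; the exact action names and record layout are illustrative assumptions, not the thesis's code:

```python
import random
from datetime import datetime, timedelta

# Constants mirroring those described in the thesis; names are assumptions.
TICKET_TYPES = ["task", "bug", "support", "question", "feature"]
ACTIONS = ["work on ticket", "reassign ticket", "attach file", "close ticket"]

def generate_events(count, first_day, seed=None):
    """Generate ticket events spread over two weeks, within 9-to-5 hours."""
    rng = random.Random(seed)
    events = []
    for _ in range(count):
        day = first_day + timedelta(days=rng.randrange(14))
        start_hours = rng.uniform(9.0, 16.5)             # start inside the workday
        duration = rng.uniform(0.1, 17.0 - start_hours)  # finish by 5 p.m.
        start = day + timedelta(hours=start_hours)
        events.append({
            "ticket_id": f"T{rng.randrange(1000):04d}",
            "type": rng.choice(TICKET_TYPES),
            "action": rng.choice(ACTIONS),
            "start": start,
            "end": start + timedelta(hours=duration),
        })
    return events

events = generate_events(100, datetime(2024, 6, 3), seed=42)
assert all(9 <= e["start"].hour < 17 for e in events)
```

Scaling `count` up to 35,000 and writing the records to an Excel file (e.g. via a spreadsheet library) reproduces the shape of the dataset the thesis describes.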
-
Item: Implementation and analysis of the secret key generation algorithm using software defined radios (Graduate School, 2024-06-27) Alper, Ertuğrul ; Özdemir, Enver ; 707211016 ; Cybersecurity Engineering and Cryptography

As the use of wireless communication systems increases, their security has become a critical focus amid various technological advancements. Given the diversity of applications and technologies, it is not possible to address the security concerns of all wireless systems in a single study. Therefore, this thesis presents the design, analysis, and implementation of a cryptographic secret key generation algorithm within two- and three-node distributed wireless systems featuring full-duplex multiple access channels, aimed at improving security in wireless communications. The thesis also includes a comprehensive review of the literature on multiple access channels and computational techniques, discussing the findings in detail. In the following chapters, wireless communication systems are explained, and multiple access channels are then examined in detail. Wireless full-duplex multiple access channels (W-FMAC) are emphasized in particular, and this technology is used in the simulations and implementations. In addition, examples of wireless half-duplex multiple access channels (W-HMAC) and non-orthogonal multiple access channels (NOMA) are discussed comprehensively along with their usage areas. Afterwards, function computation (FC) techniques are defined, which compute functions of signals while they are transmitted over the air, delivering meaningful information to the receiver. This section explains how these computations can be made over the air and what kind of designs are needed at the sender and receiver nodes.
It is then emphasized that the analog function computation (AFC) technique is used in this project, with pre-processing and post-processing functions applied at the transmitting and receiving antennas, respectively. Digital function computation (DFC) is also examined in this section and compared with the AFC technique. This background is critical for simulating and implementing the cryptographic key generation algorithm described later in the thesis, and the two- and three-node test systems are built on this basis. The following part of the thesis discusses in detail the cryptographic key generation algorithm that is the main theme of the study. First, using the wireless full-duplex multiple access channel technique, a system consisting of N users is designed and presented with its system model. Then the AFC technique required to implement the secret key generation algorithm is used, and the processing functions are explained. This section emphasizes that the secret keys chosen by the nodes are Gaussian primes, and it is proved that these primes form the main basis of the system. Afterwards, the channel model is created in the simulation environment and the channel parameters are shown. Error models are then created to measure the success of the secret key generation algorithm in the test environment; these error models are based on the distance between users and the channel estimation coefficients, and the system's success is measured with Monte Carlo simulations. Detailed explanations of the results are given in the performance evaluation section, after which the results are discussed and the ideal values of the system parameters are shared to improve the implementation of the algorithm.
Furthermore, the term software defined radio (SDR) is explained, and its capabilities and usage areas are shown. GNU Radio, the most common open-source software toolkit used to program SDRs, is introduced, followed by the platforms compatible with GNU Radio, the installation process, and the creation of software blocks, enhanced with sample designs and flowgraphs. In the following section, the Universal Software Radio Peripheral (USRP), the hardware counterpart of software-defined radios, is discussed and its hardware architecture explained. Different Ettus USRP devices are compared according to various factors, their pros and cons are presented, and it is noted that the USRP B210 model is used in this study. It is also described how to use the USRP source and sink blocks in GNU Radio and which parameters need to be set. The next stage of the thesis builds on this background to describe how the secret key generation algorithm is implemented using an SDR. First, the required software and hardware are shown; then the focus turns to how the secret key is transferred in a two-user system and how it is reconstructed at the receiver node. This is followed by a detailed diagram of the transmitter and receiver systems created in GNU Radio, the flow chart, all the parameters used, and the software blocks created. Finally, the secret key obtained in this study is compared with the theoretically calculated secret key, and error calculations are made. The final section provides a summary of all these operations and highlights the practical implementation of the study once more, then outlines the scope of subsequent research as an extension of this work and identifies areas for further development.
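The thesis states that the secret keys chosen by the nodes are Gaussian primes. As a standalone illustration of that notion (the standard classification of primes in Z[i], not code from the thesis), an element a + bi can be tested as follows:

```python
def is_prime(n):
    """Deterministic trial-division primality test (fine for small n)."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return False
        d += 2
    return True

def is_gaussian_prime(a, b):
    """Classify a + bi as a Gaussian prime via the standard criterion."""
    if a == 0:                      # purely imaginary: bi
        return is_prime(abs(b)) and abs(b) % 4 == 3
    if b == 0:                      # purely real: a
        return is_prime(abs(a)) and abs(a) % 4 == 3
    return is_prime(a * a + b * b)  # otherwise the norm must be prime

assert is_gaussian_prime(1, 1)      # norm 2 is prime
assert is_gaussian_prime(3, 0)      # 3 is congruent to 3 mod 4, stays prime in Z[i]
assert not is_gaussian_prime(5, 0)  # 5 = (2 + i)(2 - i) splits
assert is_gaussian_prime(2, 3)      # norm 13 is prime
```

How the nodes select and exchange such primes over the W-FMAC channel is, of course, the subject of the thesis itself.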
-
Ögeİkili kuadratik form ̇ile grup kimlik doğrulaması(Lisansüstü Eğitim Enstitüsü, 2023-01-31) Aksoy, Filiz ; Özdemir, Enver ; Özer, Özen ; 707191004 ; Bilgi Güvenligi Mühendisli ˘ gi ve KriptografKriptoloji, dijital ortamda taraflar arasında güvenli iletişimin gerçekleşmesi için gerekli algoritma ve protokol dizaynını amaç edinen bilim dalıdır. Sanal ortamdaki herhangi bir veri akışının güvenliği kriptografik temel taşlar ile sağlanır, Günümüzde teknolojinin gelişmesi ve internetin yaygınlaşması ile bilgi paylaşımı da kritik bir önem kazanmakta ve güvenli bilgi paylaşımı için sürekli yeni modeller geliştirilmektedir. Kriptografi biliminin amacı yalnızca mesajları şifreleme ve deşifre algoritmaları geliştirmek değil, aynı zamanda bilgi güvenliği gerektiren gerçek dünya sorunlarını çözüme kavuşturmayı sağlamaktır. Diğer bir deyişle sanal ortamda akan verilerin güvenli transferini sağlayacak uygun yapıtaşları hazırlamaktır. Bu yapıtaşların uygunluğu birçok faktöre bağlıdır. Mevcut donanım yapısına ve kullanıcıların beklediği veri akış hızına uygunluğu en öncelikli hedefler arasındadır. Dijital ortamdaki haberleşmenin güvenliği önceden belirlenen dört hedefin sağlanması ile mümkün olabilmektedir. Bu hedeflerin ilki mesajın gizliliği olarak ifade edilen gizlilik (confidentiality) kavramıdır. Mesajın karşı tarafa güvenli bir şekilde iletilmesi için tasarlanan algoritmaların ana amacı mesajın üçüncü taraflar tarafından okumasını engellemektir. Dijital ortam herkes tarafından görülebilir kabul edilmektedir. Dolayısı ile yalın halde gönderilecek bir mesaj herkes tarafından okunabilecektir. Mesajın sadece önceden belirlenen alıcılar tarafından okunabilmesi güvenli haberleşmenin en önemli öğelerinden biridir. Bir diğer amaç ise veri bütünlüğü (data integrity) yani mesajın içeriğinin değişmesini önlemektir. Mesajın içeriği iletim esnasında oluşabilecek hatalardan veya araya giren kişilerden kaynaklı değişikliğe uğrayabilmektedir. 
Bu tür manipülasyon ve değişimleri engellemek için genellikle özet (hash) fonksiyonları kullanılmaktadır. Güvenli haberleşmenin sağlaması gereken amaçlardan bir diğeri ise kimlik doğrulama (authentication), yani mesajın kaynağının ve alıcının doğrulanmasıdır. Bunun için mesajı oluşturan kişi ve zaman damgası gibi bilgileri içeren dijital imza gibi yöntemler kullanılmaktadır. Son olarak ise gönderilen mesajın gönderici tarafından inkar edilememesi (Non-repudiation), yani mesajı gönderenin mesajı kendisinin göndermediğini iddia edememesidir. Dijital imza gibi yöntemler kimlik doğrulaması ile birlikte mesajı gönderenin inkar etme durumunu da ortadan kaldırmaktadır. Güvenli haberleşmenin en önemli sac ayağı gizlilik simetrik kriptografik algoritmalar ile sağlanmaktadır. 1974 ten günümüze kadar nerdeyse tüm dijital haberleşme kanalları standart olan simetrik anahtarlı algoritmaları kullanmaktadır. Simetrik anahtarlı kriptografik sistemlerde gönderen ve alıcı taraflarının her ikisi de aynı anahtara sahip olmak zorundadır. Her ne kadar gizlilik standart simetrik şifreleme metotları ile sağlanıyor olsa da, tarafların aynı anahtarı elde etmesi en önemli problem halini almaktadır. Tarafların anahtar paylaşımı yapmadan önce birbirlerinin kim olduklarını tespit etmesi yani kimlik doğrulama yapması beklenmektedir. Kimlik doğrulama sonrasında anahtar değişimi yapılmaktadır. Kimlik doğrulama ve anahtar değişimi algoritmaları şu ana kadar sadece bir alıcı ve bir göndericinin olduğu ortamları göz önünde bulundurarak dizayn edilmiştir. Fakat günümüzde artık haberleşme birebir değil onlarca hatta binlerce aletin aynı anda veri alış verişi yaptığı iletişim sistemlerinden oluşmaktadır. Mesela nesnelerin interneti (Internet of Things - IoT) gibi teknolojilerin de gelişmesi ile hem aynı anda bir çok aynı amaç için kullanılan aletler hızlı kimlik doğrulaması ve anahtar değişimi yapması gerekmektedir. 
Since data flows from every device to the others, each of the tens or perhaps thousands of devices in the environment is expected to authenticate and share keys with all the rest. Moreover, considering that the devices joining a communication network do not all have the same technical capacity, the need for a model that also supports devices with low-power processors grows day by day. As even more devices are expected to join the data traffic, it is clear that efficient multi-party authentication and key exchange algorithms will soon be needed even more. In this thesis, an efficient authentication algorithm for such multi-party environments is developed, together with a key establishment algorithm that can be used in practice. The performance of the proposed algorithms is analyzed, and their security parameters are presented through cryptanalysis. The designed algorithms use a mathematical tool new to this setting: binary quadratic forms, long known in number theory. The first chapter gives a brief introduction to cryptography, discussing with examples the basic building blocks of symmetric- and asymmetric-key algorithms and their security. It illustrates how existing cryptographic building blocks are used to provide the desired properties of secure communication; in this context, PGP, one of the most important protocols securing e-mail services, is discussed as an example of the effective use of these building blocks. The second chapter explains one-to-one authentication and group authentication in detail, and then surveys recent work on group authentication schemes designed for multi-party authentication and multi-party key sharing. The next chapter treats in detail binary quadratic forms, the mathematical building blocks of the group authentication and key exchange algorithms we present.
This chapter covers binary quadratic forms, primitive forms, positive definite forms, equivalence of quadratic forms, equivalence classes, and reduced forms. The final chapter presents the details of the proposed model for group authentication with binary quadratic forms and compares its theoretical performance with other group authentication models.
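As a flavor of the number-theoretic machinery involved, the classical Gauss reduction of a positive definite binary quadratic form ax² + bxy + cy² can be sketched in a few lines. This is a generic textbook sketch, not the thesis's algorithm, and the function name is illustrative:

```python
def reduce_form(a, b, c):
    """Gauss-reduce a positive definite binary quadratic form ax^2 + bxy + cy^2."""
    d = b * b - 4 * a * c
    assert a > 0 and d < 0, "expects a positive definite form"
    while True:
        # translate x -> x - k*y so the middle coefficient lands in (-a, a]
        k = -((a - b) // (2 * a))
        b, c = b - 2 * a * k, a * k * k - b * k + c
        if a <= c:
            break
        a, b, c = c, -b, a  # swap outer coefficients via (x, y) -> (-y, x)
    if a == c and b < 0:
        b = -b  # convention: b >= 0 when a == c
    return a, b, c
```

For example, reduce_form(2, 3, 2) yields the reduced representative (1, 1, 2) of its equivalence class; the discriminant b² − 4ac is invariant under every step, which is what makes the reduced form a canonical label for the class.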
-
ÖgeAuthentication methods in internet banking(Graduate School, 2022-02-04) Hezer, Şahadet Gülşah ; Özdemir, Enver ; 707191008 ; Bilgi Güvenliği Mühendisliği ve KriptografiDigitalization has increased enormously, and with it more and more of our everyday tasks can be carried out over the internet. In financial transactions in particular, avoiding monetary loss requires trust in every system we use to perform a transaction; from this perspective, the authentication step with which a transaction begins is critically important. In the financial domain, authentication generally takes place when logging in to a banking account and during purchase transactions. Authentication is the first step that demands attention in order to avoid financial loss and to prevent others from transacting, or observing activity, on our behalf. If authentication is not performed securely, someone else may use or gain access to our account; likewise, purchases that we never authorized may be carried out by others. To prevent such situations, we need authentication systems and mechanisms. The first chapter of this study explains what an authentication system is, its varieties, and its historical development. The second chapter describes the authentication methods that are or could be used when logging in to a bank account over the internet. The third chapter describes the authentication processes and methods that are or could be used when making purchases with bank cards. Finally, the study presents the new authentication method we developed and compares it with existing authentication protocols in terms of security.
Authentication systems vary according to the security properties they provide, their speed, and the technology they use; each system has the potential to be more useful than the one that preceded it. The algorithms and protocols presented here offer an overview of many of the authentication methods that are or could be used in the financial domain today, and the newly proposed authentication method can bring a fresh perspective to authentication in this field.
-
ÖgePost-quantum cryptography(Graduate School, 2023-01-26) Gültekin, Veysel ; Özdemir, Enver ; 707191009 ; Bilgi Güvenliği Mühendisliği ve KriptografiWith the advent of quantum computers, the security of what we call classical cryptography, the cryptosystems used today in all cryptographic devices, has come into question. As a result of the rapid progress of quantum computers, work in cryptography has shifted toward designing and analyzing systems that are secure against them. By reducing the factoring problem from exponential to polynomial time, Shor's algorithm has put asymmetric cryptosystems in particular under threat. Today, the vast majority of data is stored encrypted with symmetric encryption algorithms, under keys shared through the key agreement algorithms of classical cryptography. Although the quantum threat primarily affects asymmetric cryptography, all data is affected, because the keys of symmetric encryption algorithms are shared by means of asymmetric cryptography. Considering in particular that encrypted data can be retained for later analysis, if quantum computers soon reach the level of solving the factoring problem efficiently, then data encrypted with symmetric algorithms from now on could also be decrypted in the future. Signature algorithms will be affected to a more limited extent, since a signature provides a cryptographic service bound to the moment of use, such as authentication or verifying a message's origin. For this reason, solving the key sharing problem under the quantum threat is regarded as a more urgent need than solving the signature problem. This study therefore focuses on the solutions that post-quantum cryptography brings to the key sharing problem.
For both of the problems mentioned here, NIST (the National Institute of Standards and Technology), which had previously standardized AES as a block cipher through a competition, launched a competition in 2016 to select a standard post-quantum key exchange algorithm and signature algorithm. After this point, work on algorithm design and analysis accelerated. The algorithms submitted to the competition can be grouped by the hard problems they rely on: lattice-based, code-based, isogeny-based, hash-based, and multivariate-polynomial-based cryptosystems. Among these, lattice-based cryptosystems have attracted the most attention and study, in part because they can also be used in constructions such as homomorphic encryption; moreover, the Kyber algorithm selected as the standard at the end of the competition is a lattice-based key exchange algorithm. Code-based cryptosystems were introduced by McEliece and rest on the decoding problem, one of the oldest problems in asymmetric cryptography. Because of the disadvantage of large key sizes, the area remained relatively little studied until the quantum threat appeared; with the introduction of the Niederreiter cryptosystem and improvements in key sizes, code-based schemes have become an important candidate for post-quantum key exchange. Isogeny-based algorithms fell into a risky position with the emergence of an attack that can be mounted on classical computers. Although NIST had placed the SIKE key agreement algorithm among its alternate candidates, it added a note advising against its use, since the attack appeared after that selection. Furthermore, because some of these cryptosystems are thought not to have received enough cryptanalysis, using post-quantum key agreement algorithms on their own has begun to be viewed with suspicion.
For this reason, hybrid key sharing protocols have also been proposed, combining at least two of the following: post-quantum algorithms, asymmetric cryptography, and previously shared symmetric keys. This study gives a broad overview of the proposals developed to solve the key agreement problem. It explains the problems on which lattice-based and code-based cryptosystems rest and how key exchange mechanisms are built on top of them. It also describes some analysis methods and provides information about the security levels and cryptanalysis of some of the standardized algorithms.
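To give a flavor of the noisy-inner-product idea underlying lattice-based schemes such as Kyber, the following toy encrypts a single bit under a plain LWE secret. This is a generic illustration, not the standardized algorithm: the parameters (q borrowed from Kyber, a tiny dimension) are nowhere near secure, and all names are ours:

```python
import random

q, n = 3329, 8  # toy parameters: q as in Kyber, dimension far too small to be secure

def keygen():
    # secret vector s in Z_q^n
    return [random.randrange(q) for _ in range(n)]

def encrypt(s, bit):
    # ciphertext (a, b) with b = <a, s> + small_error + bit * (q // 2)  (mod q)
    a = [random.randrange(q) for _ in range(n)]
    e = random.randint(-2, 2)  # small centered noise: this is what hides s
    b = (sum(ai * si for ai, si in zip(a, s)) + e + bit * (q // 2)) % q
    return a, b

def decrypt(s, ct):
    # subtracting <a, s> leaves e + bit * (q // 2); round to the nearer of {0, q/2}
    a, b = ct
    v = (b - sum(ai * si for ai, si in zip(a, s))) % q
    return 1 if q // 4 < v < 3 * q // 4 else 0
```

Decryption succeeds because the noise e is far smaller than q/4, so rounding recovers the bit exactly; an eavesdropper who sees only (a, b) faces the LWE decoding problem.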
-
ÖgeEvaluating the usability of qualified electronic signatures: Systematized use cases and design paradigms(Graduate School, 2024-12-10) Çağal, Mustafa ; Bıçakcı, Kemal ; 707201023 ; Bilgi Güvenliği Mühendisliği ve KriptografiAlthough legally equivalent to handwritten signatures, Qualified Electronic Signatures (QES) have not yet achieved significant market success. QES has considerable potential to reduce dependence on paper-based contracts, enable secure digital applications, and standardize public services; yet, despite its wide range of uses, there is limited work on its usability. This thesis was prepared to fill that gap. Within its scope, the thesis highlights the need to evaluate the strengths and weaknesses of Qualified Electronic Signatures through usability studies, systematizes QES use cases, and categorizes the design paradigms that support those use cases. It also presents findings obtained from cognitive walkthroughs conducted on the use cases of four different QES systems. The research questions are as follows: 1. What are all the use cases of Qualified Electronic Signatures (QES) across Turkey and the European Union? By "use cases" we mean the set of tasks that standard users must carry out. 2. What are the design paradigms, options, and sub-use cases associated with these use cases? 3. Given these use cases and design paradigms, what are the strengths, weaknesses, and usability challenges of practical QES systems? The European Union and Turkey were chosen as the focus of the research because they offer the potential to evaluate the strengths and weaknesses of different QES systems.
The main actors involved in QES processes are examined in detail, and it is observed that the large number of actors in these processes can affect usability. QES use cases, the sub-use cases that support them, and the associated design paradigms are identified and categorized. Subsequently, a total of 36 cognitive walkthroughs were carried out. The most important finding of the study is that remote signatures are more usable than the alternatives. To make Qualified Electronic Signatures a more attractive option for standard users, the study concludes that remote signatures should be legalized in Turkey and in other, similarly not-yet-regulated countries. This thesis is expected to provide a foundation for substantially expanding research on the usability of Qualified Electronic Signatures.
-
ÖgePrivacy and security enhancements of federated learning(Graduate School, 2024-07-12) Erdal, Şükrü ; Özdemir, Enver ; Karakoç, Ferhat ; 707211008 ; Cybersecurity Engineering and CryptographyFederated Learning has emerged as a revolutionary approach in the field of machine learning, addressing significant concerns related to data privacy and security. Traditional centralized machine learning models require data aggregation on central servers, posing substantial risks of data breaches and privacy violations. FL, on the other hand, distributes the model training process across multiple decentralized edge devices, keeping the raw data localized and mitigating the privacy risks associated with centralized data storage and processing. The motivation for this thesis stems from the growing need to enhance privacy and security in FL applications. As data privacy regulations become more stringent and public awareness of data security increases, there is a pressing demand for robust FL frameworks that can protect sensitive information while maintaining high model performance. FL's ability to leverage the computational power of edge devices, such as smartphones and IoT gadgets, makes it a promising solution for various domains including healthcare, finance, and the Internet of Things. The primary objectives of this thesis are threefold: 1. To provide a comprehensive survey of existing research on privacy-enhanced FL, synthesizing key concepts, methodologies, and findings. 2. To identify gaps, limitations, and open research questions in the current literature on privacy-enhanced FL. 3. To evaluate and compare different privacy-enhancing techniques and methodologies used in FL, assessing their effectiveness, scalability, and trade-offs. FL inherently mitigates several privacy risks by keeping data local to clients. However, it introduces new challenges, particularly related to inference attacks and model update poisoning. 
Inference attacks exploit model updates to extract sensitive information, while model update poisoning involves malicious clients injecting false updates to corrupt the global model. These challenges necessitate robust solutions to ensure the integrity and privacy of the FL process. Non-IID data and communication overheads further complicate FL implementation. Non-IID data, where data distributions vary across clients, can hinder model convergence and performance. Additionally, frequent and substantial data exchanges between clients and servers result in significant communication overheads, which can strain network resources. Several strategies have been developed to address these privacy and security challenges. Differential privacy introduces noise to data updates, ensuring that individual contributions remain confidential. Protocols that incorporate cryptographic signatures and Secure Multiparty Computation techniques further enhance the security of model updates and ensure data integrity. Co-utility frameworks, which promote mutual benefit between servers and clients, and robust aggregation methods also play vital roles in safeguarding FL systems. Innovative methodologies such as Flamingo and SafeFL leverage advanced cryptographic techniques to provide secure aggregation and enhance privacy preservation. These solutions collectively improve the robustness, efficiency, and security of FL frameworks, enabling their application in real-world scenarios. FL has been applied successfully in various domains, demonstrating its versatility and effectiveness. In wireless communication, FL enhances vehicular communication, localization, and semantic communication by enabling collaborative model training without data centralization. In the IoT sector, FL improves privacy and reduces data transfer costs, with significant applications in smart homes and industrial IoT. Healthcare is another critical area where FL has made substantial impacts. 
By allowing institutions to collaboratively train models on medical imaging and predictive analytics without sharing patient data, FL addresses stringent privacy regulations while improving model accuracy and generalizability. Studies have shown that FL can maintain high diagnostic accuracy and support personalized medicine. In the financial sector, FL addresses privacy and regulatory challenges by enabling collaborative credit risk assessment and fraud detection. By leveraging data from multiple institutions without centralizing it, FL-based models achieve higher accuracy and adaptability, enhancing the detection of fraudulent activities and improving credit scoring models. Surveys play indispensable roles and offer numerous benefits within the FL domain. They serve as comprehensive repositories of existing research, providing newcomers with a foundational understanding while guiding experienced researchers toward unexplored frontiers. By scrutinizing and synthesizing a plethora of literature, surveys identify emerging trends, highlight successful applications, and outline future research directions. Federated Learning presents a transformative approach to machine learning by enabling decentralized data processing, which addresses critical privacy and security concerns inherent in traditional centralized models. This thesis explored various facets of FL, particularly focusing on the challenges and solutions related to privacy and security, as well as its diverse applications across different sectors. Emerging trends in FL research, including advancements in cryptographic techniques, federated learning frameworks, and regulatory compliance mechanisms, underscore the need for continuous innovation and interdisciplinary collaboration. As FL continues to evolve, it holds the potential to revolutionize secure communication systems and foster a culture of security awareness and privacy by design in machine learning technologies.
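The core loop described above — clients compute local updates on data that never leaves them, and the server averages the updates, optionally adding noise in the spirit of differential privacy — can be sketched as follows. This is a minimal illustration with made-up helper names, not any of the surveyed frameworks such as Flamingo or SafeFL:

```python
import random

def local_update(weights, data, lr=0.1):
    # one full-batch gradient step of least-squares regression on a client's (x, y) pairs
    grad = [0.0] * len(weights)
    for x, y in data:
        pred = sum(w * xi for w, xi in zip(weights, x))
        for i, xi in enumerate(x):
            grad[i] += 2 * (pred - y) * xi / len(data)
    return [w - lr * g for w, g in zip(weights, grad)]

def federated_average(client_updates, sigma=0.0):
    # the server sees only model updates, never raw data;
    # sigma > 0 adds Gaussian noise in the spirit of a differential-privacy mechanism
    m = len(client_updates)
    dim = len(client_updates[0])
    return [sum(u[i] for u in client_updates) / m + random.gauss(0, sigma)
            for i in range(dim)]
```

Iterating these two steps over clients whose data comes from the same underlying model drives the global weights toward it; raising sigma trades model accuracy for stronger protection of individual contributions.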
-
ÖgeRostam: A passwordless web single sign-on solution integrating credential manager and federated identity systems(Graduate School, 2023) Mahnamfar, Amin ; Bıçakçı, Kemal ; 909785 ; Cybersecurity Engineering and Cryptography ProgramThe challenge of transitioning to a passwordless future is a multifaceted issue, especially as web applications continue to lean heavily on passwords for authentication. This problem is amplified in enterprise environments, where identity providers, tasked with overseeing federated identity management systems, maintain single sign-on (SSO) services that lack universal compatibility with all applications. To address these complexities, we introduce Rostam, an innovative passwordless solution that integrates credential management with federated identity systems, streamlining access to web applications. Password managers, as per the literature, broadly fall into two primary categories: password wallets and derived passwords. Derived passwords generate unique passwords for websites by amalgamating a master password with supplemental information, such as the target domain name. However, these password managers come with certain limitations, like the need for users to change their existing passwords for websites. Consequently, we have chosen the password wallets category, a prevalent choice among both commercially available and browser-integrated password managers. This approach offers a more secure and user-friendly solution for managing online credentials, allowing users to retain their current credentials securely in encrypted form. Our solution, Rostam, integrates seamlessly through a dashboard, displaying all applications accessible to a user with a single click, after completing a passwordless SSO process. This intuitive interface eradicates the need for users to memorize multiple passwords, simplifying the user experience by centralizing access to diverse applications. 
We have examined existing works and adhered to essential use cases and design paradigms in credential managers while designing Rostam. For instance, Rostam simplifies the setup process by offering mobile app installation and extension installation, and by requiring a cloud account. It accommodates credential registration through both manual and auto-detection methods, updates credentials manually or through auto-detection, and also allows for manual credential removal. Rostam enhances the user experience by providing various autofill credential options and handling separate subdomains. It also ensures security with manual lock and timed auto-lock features, requiring the user to reauthenticate. These diverse use cases and paradigms cater to the varied needs of users, underscoring Rostam's comprehensive approach to credential management. Many credential managers focus on thwarting server-related attacks and bolstering privacy through client-side encryption, requiring users to select and remember a potent master password. However, the memorization of such passwords poses a significant challenge. While there have been efforts to counteract this issue, such as using spaced repetition or graphical password schemes, these methods are not as robust as randomly generated long keys. Furthermore, features like the temporary or even permanent storage of the master password to enhance user experience compromise security, and the ability to alter or reset the master password is absent in some widely used credential managers. Our proposed system prioritizes security by employing a MasterKey, instead of a Master Password, to ensure the safety of encrypted passwords stored in the credential manager. In case of a security breach, encrypted passwords remain secure even if stolen from the server. This security is achieved because all keys, including the MasterKey, are robust, randomly generated, and stored securely on the client side without any user interference.
Furthermore, we employ a dual-position technique: to access and recover data, the user needs access to both Rostam's servers and one of the paired devices.
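The "derived passwords" category contrasted above — deriving each site's password from a master password plus supplemental information such as the domain — can be sketched with a standard key-derivation function. This is an illustrative scheme of that general category, not Rostam's design, and the function name and parameters are ours:

```python
import base64
import hashlib

def derived_password(master_password, domain, length=16):
    # PBKDF2 stretches the master password, using the site's domain as the salt,
    # so every site gets a distinct but reproducible password
    raw = hashlib.pbkdf2_hmac("sha256",
                              master_password.encode(),
                              domain.encode(),
                              100_000)
    return base64.b64encode(raw).decode()[:length]
```

The same master password and domain always yield the same site password, which reflects the limitation noted above: under such a scheme users must change their existing passwords for websites, one reason a wallet-style manager was chosen instead.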
-
ÖgeTwo factor authentication security(Graduate School, 2024-02-02) Kumbasar, Sümeyra ; Özdemir, Enver ; 707201011 ; Cyber Security Engineering and CryptographyThe rapid advancement of technology and the widespread use of mobile applications in various aspects of our lives have started to attract the attention of cyber attackers. As time passes, the scale of cyber attacks on these platforms is increasing, leading to reputational and financial losses for organizations. Therefore, organizations take certain security measures to protect their sensitive and confidential information from unauthorized access and its adverse consequences. Cryptographic algorithms are used to encrypt sensitive data and transmit it securely to the receiving party. To ensure security, certain fundamental security principles are followed. Encryption methods are utilized to ensure the confidentiality and integrity of critical and sensitive data. Additionally, authentication and authorization controls are enforced for access to critical data. Authorized users should have access to these data whenever needed, and the actions performed by authorized users should be logged. With the increasing presence of the internet in our lives, the number of attacks on organizations is also rising. One of the most significant among these is the DDoS attack. DDoS attacks aim to disrupt the functioning of systems by sending far more requests than they can handle. The objective of these attacks is not to access, modify, or steal critical and sensitive data, but to block and disrupt the accessibility of systems. One of the fundamental elements of resisting cyber attacks and keeping systems secure is authentication. Two-factor authentication (2FA), one of the advanced authentication methods designed to protect systems from unauthorized access, is commonly deployed by many organizations as an additional security layer.
However, despite enhancing access security, these systems have security vulnerabilities of their own. In this master's thesis, we show how two-factor authentication, commonly used for security, can be exploited by attackers to mount denial-of-service attacks that halt the operation of services. The authentication phase has two steps: first, the authentication server verifies the username and password; second, it generates a PIN and sends it to the user. In our study, to model the PIN generation step, we have the server generate a random number and encrypt it using 2048-bit numbers. Systems perform this operation in milliseconds; however, as the number of requests received per second increases, the encryption with 2048-bit numbers becomes more demanding, and the server's response rate drops. Freezing and slowdowns are also observed on the device where the program runs. As a result, the incoming requests eventually produce the effect of a DoS attack, degrading the server's performance. We anticipate that in mobile banking applications, simultaneous authentication requests from legitimate users would slow down the application in a similar way, resembling a denial-of-service attack, causing the application to respond to fewer requests and reducing its functionality.
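The cost at the heart of the argument above — one 2048-bit modular exponentiation per authentication request — can be approximated with a short timing sketch. The parameters and names here are stand-ins of ours, not a reproduction of the thesis's experiment:

```python
import random
import time

BITS = 2048  # the key size discussed above; everything else here is illustrative

def make_params():
    # a random odd 2048-bit modulus and exponent stand in for a real RSA key pair
    n = random.getrandbits(BITS) | (1 << (BITS - 1)) | 1
    d = random.getrandbits(BITS) | 1
    return n, d

def encrypt_pin(pin, n, d):
    # one modular exponentiation: the dominant per-request cost on the server
    return pow(pin, d, n)

n, d = make_params()
t0 = time.perf_counter()
for _ in range(50):
    encrypt_pin(random.randrange(10 ** 6), n, d)
elapsed = time.perf_counter() - t0
per_request = elapsed / 50  # multiply by the request rate to estimate server load
```

Even a few milliseconds per request adds up: at per_request seconds each, a burst of concurrent PIN requests quickly saturates a single core, which is the DoS-like effect the thesis observes.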