LEE - Graduate Program in Information Security Engineering and Cryptography
A hierarchical key assignment scheme for access control in cloud computing (Graduate School, 2022-06-10)
Çeliktaş, Barış ; Özdemir, Enver ; 707182002 ; Cyber Security Engineering and Cryptography

Today, organizations no longer have to spend huge amounts of money on physical servers and related information technology infrastructure such as server rooms or data centers, because both the large initial capital expenditure and the operational expenditures are significantly reduced by a relatively new method called cloud computing. The administrative challenges organizations face include establishing well-designed disaster recovery and business continuity plans, building fault-tolerant and scalable systems, ensuring full-time availability, and enabling greater collaboration with stakeholders. All of these lead companies to outsource services such as storage systems, large-scale computation, and hosting. Among cloud deployment models, the public cloud is currently the most preferred by companies due to its cost-effectiveness, although it raises many concerns, especially for military, health, and banking organizations, where confidentiality and privacy are crucial. The main concerns of these organizations, which operate in a hierarchical manner, are confidentiality, privacy, availability, integrity, reliability, data lock-in, and regulatory compliance. Beyond the above-mentioned concerns, the integration of the data owner's access control policy into any cloud deployment model is also a challenging topic in the research community. In this thesis, we focus on finding a solution to the confidentiality and privacy concerns. The first place to look for such a solution is cryptographic tools, and it is crucial that organizations follow a secure key management policy when using encryption to ensure the confidentiality of sensitive data.
What motivates this research is to introduce a secure, flexible, hierarchical, and practical key access control mechanism that eliminates or minimizes confidentiality and privacy concerns in the transition to the cloud for hierarchical organizations handling sensitive data. In this context, we present two different hierarchical access control schemes to be used in the secure adoption of the public cloud by hierarchical organizational structures and demonstrate that these schemes provide a flexible, efficient, and secure hierarchical key access control mechanism for the entire hierarchy. Note that these schemes can also be used by organizations that do not consume cloud services to manage their internal key management and access controls. The first proposed scheme is based on an inner product space and the orthogonal projection method, whereas the second is based on Shamir's secret sharing algorithm and the polynomial interpolation method. The two schemes also differ in approach. The first adopts a top-down approach, in which a user at any security level can by default access the keys/data of the same and/or lower security levels, while the second adopts a bottom-up approach, in which access to a key/data item requires the approval of users at the same and/or higher security levels. The first scheme, based on an inner product space, can be utilized in any cloud delivery model where the data owner implements a hierarchical access control policy. When the data owner distributes a basis to each class, a left-to-right, bottom-up policy provides much more flexibility and efficiency, especially during any change in the structure. For each class, the secret keys can be derived only when a predetermined subspace is available. This scheme is resistant to collusion/collaboration attacks and privilege creep problems, and it provides key recovery and key indistinguishability security.
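The orthogonal-projection primitive behind the first scheme can be illustrated with a short linear-algebra sketch. This is only the underlying mathematical operation, not the thesis's actual key-assignment construction; the matrix and vectors below are arbitrary examples.

```python
import numpy as np

def projection_matrix(B):
    """Orthogonal projection onto the column space of B: P = B (B^T B)^-1 B^T."""
    return B @ np.linalg.inv(B.T @ B) @ B.T

# A 2-dimensional subspace of R^4, spanned by the columns of B.
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0],
              [0.0, 0.0]])
P = projection_matrix(B)

v = np.array([1.0, 2.0, 3.0, 4.0])
p = P @ v        # component of v inside the subspace
r = v - p        # residual, orthogonal to the subspace

# The residual is orthogonal to every basis vector of the subspace,
# and projecting twice changes nothing (P is idempotent).
print(np.allclose(B.T @ r, 0))   # True
print(np.allclose(P @ P, P))     # True
```

In a scheme of this kind, key material derivable by a class corresponds to the component lying inside a designated subspace; holders of a basis for that subspace can compute the projection, while others cannot.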
The performance analysis also shows that the data storage overhead is much more tolerable than that of other schemes in the literature. In addition, another advantage is that only one operation is required to derive the secret key of a child class securely and efficiently. In other words, the experimental results satisfy all of the desired performance and security requirements. The second scheme is based on Shamir's secret sharing algorithm and the polynomial interpolation method. We provide a secure method for each user of the organization to access the public cloud from both inside and outside the company's network. The scheme offers a secure, flexible, and hierarchical key access mechanism for organizations handling sensitive data. It also minimizes concerns about moving sensitive data to the public cloud and, by making use of the topological ordering of a directed graph (including self-loops), ensures that only users with sufficient approvals from users of the same or higher privilege can access the data. Our policy in this scheme is to obtain approval for bottom-up access. The main overheads, such as public and private storage needs, are reduced to a tolerable level, and the key derivation is cost-effective. From a security perspective, this scheme is both resistant to collusion/collaboration attacks and provides key indistinguishability security. Since the key does not need to be stored anywhere, the key disclosure risk is also eliminated. In summary, to take full advantage of these different approaches, the data owner can choose whichever is best suited to the security policy and hierarchical structure of the organization. If required, the data owner can also design an infrastructure that mixes the two approaches.
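The Shamir primitive underlying the second scheme can be sketched in a few lines. This is only the textbook secret-sharing and Lagrange-interpolation mechanism, not the thesis's full bottom-up approval protocol; the prime, secret, and threshold are illustrative.

```python
import random

P = 2**127 - 1  # a Mersenne prime, used as the field modulus

def make_shares(secret, threshold, n_shares, rng=random.Random(0)):
    # f(x) = secret + a1*x + ... + a_{t-1}*x^{t-1} mod P; the secret is f(0).
    coeffs = [secret] + [rng.randrange(P) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares):
    # Lagrange interpolation evaluated at x = 0.
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = make_shares(secret=123456789, threshold=3, n_shares=5)
print(reconstruct(shares[:3]))  # any 3 of the 5 shares recover 123456789
```

The bottom-up flavor of the scheme comes from the threshold: a key can be reassembled only when enough approving parties contribute their shares, so no single user below the threshold can derive it alone.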
A hierarchical key assignment scheme: A unified approach for scalability and efficiency, with a specialized implementation in cloud computing (Graduate School, 2024-07-16)
Çelikbilek, İbrahim ; Özdemir, Enver ; 707202005 ; Cybersecurity Engineering and Cryptography

Access control is a fundamental component of information security management, defined as the process of selectively restricting access to resources. This process includes policies and protocols that determine who can access various system resources, under what conditions, and when. It primarily aims to protect data integrity and confidentiality. The proper configuration and implementation of access control systems are crucial, especially for organizations that handle critical and sensitive data. Access methods prevent unauthorized access, thereby protecting sensitive data within the organization from disclosure, alteration, or destruction. Configuring and managing access control processes requires the establishment of systems that control and monitor access to resources. These systems operate within the framework of predefined dynamic or static rules and policies. The primary goal is to ensure that only authorized users can access target resources and perform specific actions. Various access models have been developed to implement access controls effectively. These models, which regulate access to system resources, include mandatory, discretionary, role-based, rule-based, attribute-based, and identity-based access methods. Each model aims to provide solutions that meet the requirements of the access environment and comply with institutional or organizational policies. In cases where these models alone are insufficient, particularly in environments with resources and users at different security and clearance levels, the use of multilevel access control models such as Bell-LaPadula may be necessary.
These and similar models can typically be tailored to the needs of the access environment by combining multiple simple access models and making various additions and modifications. If the users and/or resources in an access environment have a hierarchical structure, and access to resources is granted hierarchically, this type of control is called hierarchical access control. Such access environments require various access tools and policies, along with multilevel access control models, to make access secure, hierarchical, and effective. Hierarchical key assignment schemes are one of the most crucial components of the information security management systems of organizations that handle sensitive data. As an application of hierarchical access control, these schemes ensure hierarchical and secure access to secret cryptographic keys for users at various clearance levels. In hierarchical key assignment schemes, users within the access environment are divided into different classes (groups) that form a hierarchical structure, and a unique secret cryptographic key is assigned to each class. The hierarchical structure based on these classes forms a partially ordered set, which is often represented by an access graph. Typically, these structures define public/private key components for the scheme itself, and for the classes and/or edges within the access graph. In an access graph, a user in a class at a higher security (classification) level can derive the secret key of their own class, and also the secret keys of all descendant classes, using a combination of their own class's secret key and the public/private key components of the descendant classes, the scheme, and/or the edges. These schemes serve as a crucial component of cryptographic key management systems in various critical domains today.
Among these domains are cloud computing, organizational data access, healthcare systems, multilevel databases, the Internet of Things, drone swarm coordination, and the protection of customer information in the finance sector. Particularly in cloud computing environments, the presence of different user roles and access levels necessitates hierarchical and multi-layered access to system resources.
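The top-down derivation described above (a parent key plus public edge components yields a child key) can be illustrated with a common textbook construction from the hierarchical key assignment literature, using a one-way hash and public edge tokens. This is an illustration of the general idea only, not the unified scheme proposed in this thesis; the graph, labels, and master secret are made up.

```python
import hashlib

def H(*parts):
    h = hashlib.sha256()
    for p in parts:
        h.update(p)
    return h.digest()

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

# A small access graph: Top -> {Mid1, Mid2}, Mid1 -> Low.
edges = [("Top", "Mid1"), ("Top", "Mid2"), ("Mid1", "Low")]
keys = {c: H(b"master-secret", c.encode())
        for c in ["Top", "Mid1", "Mid2", "Low"]}

# Public edge tokens: token = child_key XOR H(parent_key, edge label).
# Publishing tokens reveals nothing without the parent key.
tokens = {(u, v): xor(keys[v], H(keys[u], u.encode(), v.encode()))
          for u, v in edges}

# A Top-level user derives Low's key by walking Top -> Mid1 -> Low,
# combining their own key with the public tokens along the path.
k = keys["Top"]
k = xor(tokens[("Top", "Mid1")], H(k, b"Top", b"Mid1"))   # = keys["Mid1"]
k = xor(tokens[("Mid1", "Low")], H(k, b"Mid1", b"Low"))   # = keys["Low"]
print(k == keys["Low"])  # True
```

Derivation is one-way: a Low user holding only keys["Low"] cannot invert H to climb toward Mid1 or Top, which is the security property hierarchical key assignment schemes formalize.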
A new public key algorithm and complexity analysis (Graduate School, 2023-06-23)
Çağlar, Selin ; Özdemir, Enver ; 707201029 ; Cybersecurity Engineering and Cryptography

With the development of technology, many processes have begun to be digitized. As a result of this digitalization, digital communication has become inevitable in our lives. Digital communication is faster and easier to access than traditional communication methods. Especially during the Covid-19 pandemic, the contribution of digitalized processes to our daily life was felt visibly. As a result of digitization, a great deal of data belonging to different data classes has been transferred to the digital environment. The transfer of information to digital media has changed the methods of storing and using data. At this point, the importance of issues such as data privacy and security has increased, and the concept of secure digital communication has come to the fore. Secure digital communication deals with providing the cornerstones of security, such as confidentiality, integrity, and authentication, while transferring data over digital channels. Confidentiality is the process of preventing unauthorized parties from viewing sensitive data and ensuring that only those who have been given permission can do so. This can be achieved through data encryption, access controls, and secure channels. Integrity refers to the assurance that data remains unaltered and uncorrupted during transmission, storage, and processing, ensuring that the data can be trusted and relied upon. Techniques such as digital signatures and hash functions can be used to verify the integrity of data. Verifying a user's or a device's identity when they want to access data or services is referred to as authentication. This is typically achieved through the use of digital signatures, which are cryptographic techniques that verify the authenticity of data by verifying the identity of the sender.
Together, these three principles form the foundation of secure communication. When sharing data in a public environment, the data to be transferred must be protected. In other words, there is a need to ensure the principle of confidentiality, which is the main starting point of this study. Cryptography, which enables encryption structures, is used to ensure confidentiality. Symmetric key cryptography, which is more efficient in terms of key length and cryptographic operations and uses the same key in the encryption and decryption processes, is widely used for encryption. In symmetric key cryptography, the parties that encrypt and decrypt the data must use the same cryptographic key, and this key must be shared securely between them. Asymmetric key cryptography is used to share the symmetric key, especially in processes established in a public environment where the parties have no opportunity to exchange keys directly and physically. Asymmetric key cryptography is based on the use of a key pair consisting of a public and a private key. The public key can be shared openly with other parties, who use it to send encrypted data. The private key, on the other hand, is used to decrypt the received encrypted data and must be kept securely by the owner of the key pair. Asymmetric key cryptography provides both confidentiality and authentication. The fact that it can also provide authentication increases security in key exchange processes. After the parties verify each other cryptographically during key exchange, asymmetric key cryptography provides an environment for sharing the symmetric key that will be used to secure the communication. The RSA algorithm is one of the oldest and most widely used asymmetric key algorithms. Its security is based on the difficulty of factoring integers.
In the RSA algorithm, the public key modulus is the product of two large prime numbers of the same size. Revealing these two primes is enough to break the algorithm. At the same time, it may be possible to recover the message from the encrypted data without factoring; this is called the RSA problem. Research has suggested that there may be an easier way to recover a message from encrypted data than factoring. If an effective method is developed for the RSA problem, the security of many RSA-based systems will be under threat. In this thesis, a new public key algorithm is proposed that can serve as an alternative to RSA in case the RSA problem is solved. This algorithm is based on the use of nodal curves, and its group structure differs from that of RSA. In the proposed algorithm, the discrete logarithm problem is believed to be harder, since the group structure in which the algorithm works is based on polynomial arithmetic and is also inspired by elliptic/hyperelliptic curves. On this basis, it is assumed that the proposed algorithm may be more resistant to the weakness behind the RSA problem. In addition, a new group operation algorithm (an addition algorithm) is presented, obtained by modifying the Mumford representation and Cantor's algorithm in order to perform the group operation on nodal curves. A performance comparison between the presented group operation on nodal curves and Cantor's algorithm has been made, and the new group operation was found to be more efficient. Moreover, the proposed algorithm is probabilistic: even if the data to be encrypted does not change, the resulting ciphertext can differ. The RSA algorithm, by contrast, is deterministic; additional padding is needed to produce different ciphertexts from the same data.
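The deterministic behavior of textbook RSA mentioned above is easy to demonstrate: without padding, equal plaintexts always yield equal ciphertexts. The tiny primes below are for illustration only and are completely insecure.

```python
# Toy RSA with insecure parameters, to show determinism of textbook RSA.
p, q = 61, 53
n, e = p * q, 17
d = pow(e, -1, (p - 1) * (q - 1))   # modular inverse (Python 3.8+ pow)

m = 42
c1 = pow(m, e, n)
c2 = pow(m, e, n)
print(c1 == c2)             # True: same message always gives same ciphertext
print(pow(c1, d, n) == m)   # True: decryption recovers the message
```

Because an eavesdropper can detect repeated messages from repeated ciphertexts, real deployments apply randomized padding (e.g. OAEP) before exponentiation; a probabilistic algorithm such as the one proposed here avoids the issue by construction.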
Since the proposed public key algorithm is based on polynomial arithmetic, it has no performance advantage over the RSA algorithm; there is a trade-off between security and performance. To show the practical applicability of the proposed solution, a performance comparison with the RSA algorithm has also been made. The performance problem is caused by the exponential growth of the secret key as the degree of the nodal curve increases. In other words, the proposed algorithm turned out to be slower than RSA in the decryption phase. However, since the decryption process in asymmetric key cryptography is generally not performed by individual users, powerful servers are not expected to be affected by this performance problem. The tests were carried out using the SageMath library and the Python programming language.
Analyzing individual data for insider threat detection (Graduate School, 2024-07-16)
Yakar, Burak ; Özdemir, Enver ; 707211003 ; Cybersecurity Engineering and Cryptography

Insider threats have been recognized as one of the most significant risks in cybersecurity. Research indicates that a majority of security breaches are caused by attacks or vulnerabilities originating from within the organization. Even with the most secure systems, as long as humans are part of the system, absolute security cannot be guaranteed. Technology is everywhere in our lives. People use smartphones, smartwatches, computers, and various other smart devices, all of which collect data to some extent. This data collection occurs not only on a personal level but also across businesses of all sizes. As businesses invest heavily in their operations, they need to secure their assets, so they invest in security measures. While some of these investments are physical precautions against physical risks, others relate to cybersecurity and mitigate cyber risks. Even if businesses build the best IDS (Intrusion Detection System) or IPS (Intrusion Prevention System), there may still be ways for attackers to infiltrate and sneak in. This is because humans are the weakest component of any ICT (Information and Communications Technology) security system and present the greatest risks and threats to a company, organization, or system. Insider threats are cybersecurity threats that originate from authorized users, such as employees, business partners, contractors, vendors, and former employees. Misusing legitimate user credentials and account hijacking are some of the methods used to carry out these intentions. These actions are not necessarily all intentional; some may be unintentional. As a result of these actions, however, the confidentiality, integrity, and availability of systems and data are compromised.
These actions can cause significant expenses that most SMEs (small to medium-sized enterprises) cannot afford. This study focuses on defining insider threats, mitigating the security risks that lead to insider vulnerabilities, and preventing insider threats by analyzing individual data using the random forest algorithm. The aim of this study is to find a method to detect malicious intentions and prevent potential attacks before they occur.
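The random-forest approach can be sketched with scikit-learn (assumed available) on synthetic user-activity feature vectors. The feature names and data below are hypothetical stand-ins; the study itself works on real individual behavioral data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Hypothetical per-user-day features: [logon_count, after_hours_logons,
# files_copied_to_usb, emails_with_attachments].
normal = rng.normal(loc=[10, 0.2, 0.1, 3], scale=1.0, size=(200, 4))
malicious = rng.normal(loc=[12, 3.0, 4.0, 8], scale=1.0, size=(20, 4))

X = np.vstack([normal, malicious])
y = np.array([0] * 200 + [1] * 20)   # 1 = insider-threat behavior

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Score a clearly anomalous day: many after-hours logons, heavy USB copying.
print(clf.predict([[11, 4.0, 5.0, 9]]))
```

In practice the class imbalance shown here (few malicious days among many normal ones) is the central difficulty, which is one reason ensemble methods such as random forests are a common choice for this task.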
Design and analysis of privacy-preserving and regulations-compliant central bank digital currency (Graduate School, 2024-07-12)
Doğan, Ali ; Bıçakcı, Kemal ; 707211012 ; Cybersecurity Engineering and Cryptography

Significant advances have been made in the field of Central Bank Digital Currency (CBDC) in the last five years. These advances have taken place not only in the academic world but also in central banks. Currently, more than 130 countries continue their CBDC studies at the research, pilot, and proof-of-concept levels. The increased interest in CBDC can be attributed to various factors, such as progress in digital payment technologies, the widespread use of cryptocurrencies in the digital money market, and the advantages brought by this technology. Alongside these advantages, there are challenges and problems that must be resolved before CBDCs reach maturity. One of these problems is the conflict between efforts to protect the privacy of digital currency users and the compliance mechanisms introduced by states to ensure financial stability and social order. States try to prevent and monitor financial crimes through regulations such as anti-money laundering and countering the financing of terrorism. However, in the transition to digital money, such regulations could lead to citizens' lives being completely monitored. In addition to this conflict, a significant portion of existing CBDCs operate on blockchain-based systems. Due to the transparent structure of the blockchain, parties included in the network can track and monitor users' transactions, and transaction privacy is ignored. In the present study, solutions to the mentioned privacy problems are introduced with cryptographic techniques such as zero-knowledge proofs, threshold cryptography, and homomorphic encryption. In the proposed system, the user's balance is kept homomorphically encrypted on the blockchain.
To perform a transfer, the sender encrypts the amount to be transferred with his own public key, the receiver's public key, and the regulators' public key. The sender then creates a zero-knowledge proof that the amount is the same in all three ciphertexts. Since the transaction is processed over ciphertexts, the user must also create a range proof showing that his balance is sufficient. After all the proofs are created and transmitted to the blockchain, the nodes confirm the transaction; the sender's balance is homomorphically decreased via the ciphertext and the recipient's balance is increased. In any suspicious case, the user's transaction history can be traced back by government institutions called regulators. However, threshold encryption is used to ensure that this control is not left to the discretion of a single institution: the institutions must reach a consensus, and only after the threshold value is reached can they see the transaction details. Additionally, techniques are suggested so that commercial banks can continue their services in this system.
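The homomorphic balance update can be illustrated with Paillier encryption, one standard additively homomorphic scheme; the abstract does not name the exact scheme used, so this is an illustrative choice, and the tiny primes are insecure toy parameters.

```python
import math, random

# Toy Paillier keypair (insecure parameters, illustration only).
p, q = 17, 19
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)
g = n + 1
mu = pow(lam, -1, n)              # valid decryption helper because g = n + 1

def encrypt(m, rng=random.Random(1)):
    while True:
        r = rng.randrange(1, n)
        if math.gcd(r, n) == 1:
            break
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    L = (pow(c, lam, n2) - 1) // n
    return (L * mu) % n

# Homomorphic balance update: subtract a transfer of 30 from a balance of
# 100 without ever decrypting. Subtraction = adding n - 30 modulo n, and
# adding plaintexts = multiplying ciphertexts modulo n^2.
enc_balance = encrypt(100)
enc_debit = encrypt(n - 30)
enc_new = (enc_balance * enc_debit) % n2
print(decrypt(enc_new))           # 70
```

Because encryption is randomized, the same balance encrypts differently each time, which is exactly what keeps observers on the transparent ledger from linking amounts across transactions.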
Detecting malicious activity inside of the network (Graduate School, 2023-12-20)
Kumbasar, Ayşenur ; Özdemir, Enver ; 707201002 ; Cybersecurity Engineering and Cryptography

In today's world, with global development and digitalization, applications and services used in the banking and finance sectors, as in all sectors, have quickly adapted to the online world. The increasing rate of transition to the Internet environment shows that security is becoming an ever more important and serious issue for banks and customers. Companies serving the financial and banking sectors are an attractive target for cyber attackers, both in terms of the damage done to the target system and the data obtained. Protecting information systems that contain important and sensitive business and customer information, such as databases, servers, computers, and the networks used, is of high importance. Likewise, providing a secure and robust online communication environment in the services offered to customers and ensuring that data is transmitted over reliable channels is one of the most important elements in the banking sector. Banks are also making major investments in security systems to ensure secure communication and the protection of personal and business information and documents as a precaution against the increasing number of cyber attacks. With these systems, they have the potential to prevent such attacks by detecting and responding to abnormal and unauthorized activities. However, research shows that the majority of cyber attacks are carried out by insiders, while most security products in use focus on external threats. If the attacker is a person working within the organization, these systems may be insufficient to detect such activities. The inside attacker has legitimate access privileges to sensitive data, systems, and networks that outsiders do not have.
Insider attacks are difficult to predict and prevent because the malicious insider follows legitimate paths and methods. Since insiders have detailed information about the internal organization, such as the corporate network, they can misuse sensitive and confidential data and cause irreversible damage to organizations, creating great losses. Therefore, it can be said that the cost of damage caused by internal threats is much higher than that of external threats. This study focuses on detecting insider threats by monitoring users with a behavioural focus. By examining normal user behaviour and malicious user behaviour with the SVM, KNN, and Random Forest algorithms, it aims to detect internal threats and to help minimize the damage that can be done to the institution through the preventive controls that follow.
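The comparison setup described above can be sketched with scikit-learn (assumed available): the same synthetic behavior features are scored by SVM, KNN, and Random Forest. The features and data are illustrative, not the thesis's dataset.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)

# Hypothetical features: [sessions_per_day, off_hours_activity, data_exfil_mb]
normal = rng.normal([5, 1, 0.5], 1.0, size=(300, 3))
malicious = rng.normal([9, 4, 3.0], 1.0, size=(30, 3))
X = np.vstack([normal, malicious])
y = np.array([0] * 300 + [1] * 30)
Xtr, Xte, ytr, yte = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

results = {}
for name, clf in [("SVM", SVC()),
                  ("KNN", KNeighborsClassifier(n_neighbors=5)),
                  ("RandomForest", RandomForestClassifier(random_state=0))]:
    results[name] = clf.fit(Xtr, ytr).score(Xte, yte)
    print(f"{name}: {results[name]:.2f}")
```

On well-separated synthetic clusters all three classifiers score highly; the interesting comparisons in practice come from noisier, imbalanced real behavioral logs.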
Distributed anomaly-based intrusion detection system for IoT environment using Blockchain technology (Graduate School, 2022-02-04)
Hejazi, Nouha ; Özdemir, Enver ; 707191006 ; Cybersecurity Engineering and Cryptography

The IoT world is growing rapidly. One of the most important challenges facing the commercialization of IoT-related innovations is preserving system security and the privacy of users' information, as well as achieving high acceptance levels. Unfortunately, IoT inherits security threats from its enabling technologies and places many constraints on any applicable security solution because of the special characteristics of IoT systems, which make preserving the system's security more challenging. This broadens the threat landscape and makes the system vulnerable to inside as well as outside attacks. At the same time, IoT networks are usually implemented on a vast scale, which makes them produce a huge amount of data during communication. This fact makes machine learning a promising solution for securing IoT systems: the data can be analyzed and used to detect abnormal behavior or anomalies. Nevertheless, given the resource and power constraints under which IoT devices operate, it is vital to reduce the storage and processing power needed by the detection algorithm, or to propose an architecture that distributes the load over network nodes. Instead of implementing the intrusion detection system in a centralized way and handling data from the whole IoT system (which exposes the system to attacks, creates a single point of failure, and puts it at risk if the central server is compromised), a distributed collaborative architecture can be used to take advantage of the massive deployment of IoT devices. Collaborative intrusion detection systems have better knowledge of their protected environments and provide a solution for applications that are sensitive to user privacy.
In this work, we introduce a new security solution for intrusion detection in IoT systems. Our proposed solution utilizes a distributed collaborative architecture, trying to take advantage of the IoT structure and overcome its limitations. A federated learning method is proposed in this thesis: each node trains a local model on its private dataset, and the parameters of the local model are then shared with other nodes in order to generate a better global model. This thesis proposes utilizing a Generative Adversarial Network (GAN) to detect anomalies. The model is trained on normal system behavior, letting the generator mimic attacks while the discriminator detects anomalies based on their difference from normal behavior. This technique offers a solution to the problem of the limited number of data points that represent malicious behavior. Additionally, this thesis suggests employing an autoencoder for feature extraction, for four main purposes. The first is to improve the efficiency of the GAN training process by lowering system congestion. The second is to minimize the sample size required. Similarly, the third is to make the training and classification process lighter and easier. Finally, it can also conceal the data in scenarios where a device shares its data along with its model's parameters to gain trustworthiness. On the other hand, our solution employs data sharing and mutual trust between system devices using blockchain technology. The collaborating devices share their models' parameters over the blockchain. In this way, they can compute the general global model by averaging all shared models, or they can check their results against their neighbors' models. Furthermore, in a distributed peer-to-peer IDS network, alert exchange between the different IDS nodes is vital for detecting anomalies and determining the trustworthiness of the network's nodes.
Additionally, system devices might share an encoded version of their data over the blockchain along with their models' parameters to enable other devices to verify a detected intrusion. To determine the trustworthiness of a node, a calculation can be initiated based on the fulfillment of received alert-related information. The blockchain registry would then include the alerts generated by each IDS node, and the collaborating nodes would depend on the consensus protocol to judge the validity of the alerts before inserting them into the blockchain. However, since each IoT system might have a different structure and characteristics according to its functionality and the circumstances in which it is implemented, different IoT systems might apply our suggested solution with different settings. Also, owing to the limitations our research faced in terms of time and equipment, we present a general structure for the proposed system and discuss it from the security aspects that govern collaborative distributed IDSs.
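The parameter-averaging step at the heart of the federated approach can be sketched in a few lines of numpy. The local "training" here is a tiny logistic-regression stand-in for the GAN of the proposed system, and the blockchain transport is omitted; only the federated-averaging (FedAvg-style) aggregation is shown.

```python
import numpy as np

def local_train(weights, X, y, lr=0.1, epochs=50):
    """A few steps of logistic-regression SGD as a stand-in for local training."""
    w = weights.copy()
    for _ in range(epochs):
        pred = 1 / (1 + np.exp(-(X @ w)))
        w -= lr * X.T @ (pred - y) / len(y)
    return w

rng = np.random.default_rng(0)
nodes = []
for _ in range(3):                      # three IoT nodes with private data
    X = rng.normal(size=(50, 2))
    y = (X[:, 0] + X[:, 1] > 0).astype(float)
    nodes.append((X, y))

global_w = np.zeros(2)
for _ in range(5):                      # five federation rounds
    # Each node trains locally and shares only its parameters.
    local_ws = [local_train(global_w, X, y) for X, y in nodes]
    # The aggregation step: average the shared parameter vectors.
    global_w = np.mean(local_ws, axis=0)

print(global_w.round(2))
```

The raw datasets never leave the nodes; only the weight vectors are exchanged, which is what makes this architecture attractive for privacy-sensitive IoT deployments.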
Generating synthetic data for user behavior based intrusion detection systems (Graduate School, 2024-07-16)
İbrahimov, Ughur ; Özdemir, Enver ; 707211009 ; Cybersecurity Engineering and Cryptography

Intrusion detection systems are at a critical point in the effort to mitigate cyber vulnerabilities. While malicious actors increase day by day, the demand for multifunctional IDS models constantly grows. Since data plays the most crucial role in all cybersecurity measures, obtaining data is very important when developing these security precautions. At this point, synthetic data makes a unique contribution to overcoming the problem of data scarcity. This thesis examines the intrusion detection concept, the necessity of synthetic data in cybersecurity, and synthetic data generation methods. The analysis provides information about the relationship between synthetic data and intrusion detection systems, the application process of synthetic data, and the privacy topics that arise while generating and implementing artificial data for cybersecurity measures. After a detailed analysis, we decide on a generation method and tool for the purpose of this thesis. Since there are various methods and techniques for producing synthetic data for different purposes, we need to choose the right modeling and method for our work. Synthetic data generation methods include machine learning approaches such as generative adversarial networks (GANs) and variational autoencoders (VAEs), as well as approaches such as simulation, interpolation and extrapolation, statistical modelling, and others. In this thesis, we generate synthetic data that shows the daily behavior of a user who works as an information technology support technician and deals with tickets. Python libraries are used on the technical side to produce the manufactured data. Moreover, a scenario was developed to establish a synthetic dataset that is as close to real-life incidents as possible.
Constants such as ticket identifications, ticket types, and action types are clearly defined in order to generate balanced synthetic data. One of the requirements for synthetic data usage in different industries is that it be constructed in a balanced shape. Ticket types are defined as task, bug, support, question, and feature; we then defined actions that contain work on a ticket, reassign a ticket, attach a file to a ticket, and others. Although approximately 35,000 movements were created over a two-week period, the duration of the experiment could be extended over a longer period of time for a more realistic distribution in later developments. We also decided to have the synthetic data show actions between 9 A.M. and 5 P.M., which are working hours. The time spent is calculated from the difference between randomly assigned start and finish times within these hours. The generated data is stored in an Excel file, which contains approximately 35,000 lines. It is possible to change the amount according to the purpose by making changes in the code. The statistical distribution of the result is shown in histograms at the end.
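A condensed sketch of this generation approach: fixed constants, action times drawn only within working hours, and one tabular record per movement. The constants, volumes, and time spans below are illustrative stand-ins, not the thesis's exact configuration.

```python
import random
import datetime

TICKET_TYPES = ["task", "bug", "support", "question", "feature"]
ACTIONS = ["work_on_ticket", "reassign_ticket", "attach_file", "comment"]

def generate_day(day, n_events, rng):
    rows = []
    for _ in range(n_events):
        start_h = rng.uniform(9, 16.5)      # keep events inside 9 A.M.-5 P.M.
        duration = rng.uniform(0.1, 0.5)    # hours spent on the action
        start = day + datetime.timedelta(hours=start_h)
        rows.append({
            "ticket_id": f"TCK-{rng.randrange(1000, 9999)}",
            "ticket_type": rng.choice(TICKET_TYPES),
            "action": rng.choice(ACTIONS),
            "start": start,
            "end": start + datetime.timedelta(hours=duration),
        })
    return rows

rng = random.Random(7)
first_day = datetime.datetime(2024, 1, 8)
data = []
for d in range(10):                         # ten working days
    data.extend(generate_day(first_day + datetime.timedelta(days=d), 250, rng))

print(len(data))                            # 2500 movements
print(all(9 <= r["start"].hour and r["end"].hour <= 17 for r in data))  # True
```

Writing the rows out to Excel (for example with pandas' DataFrame.to_excel) and plotting histograms of the type/action columns then reproduces the workflow described above.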
-
ÖgeGroup authentication and key establishment scheme(Graduate School, 2024-09-03) Güzey, Sueda Rüveyda ; Özdemir, Enver ; 707192005 ; Cybersecurity Engineering and CryptographyThe authentication phase serves as the foundational cornerstone for ensuring secure data transmission and confidential communication. In the ever-expanding landscape of devices communicating with each other, especially in the IoT, the conventional one-to-one approach to authentication poses a significant challenge, placing a growing strain on computation and communication as the environment becomes more complex. In other words, standard cryptographic algorithms, such as RSA, which relies on the hardness of factoring, and the Diffie-Hellman key exchange, which relies on the hardness of the discrete logarithm problem, have traditionally been used for authentication. However, these algorithms may not be suitable for resource-constrained devices, particularly in dynamic and crowded environments. Group authentication schemes (GASs) represent an innovative approach to authentication. Group authentication involves verifying that a designated set of users are part of a specific group and, when needed, subsequently distributing a shared key among them for confidential group communication; that is, a GAS can authenticate many users simultaneously. Recently presented group authentication algorithms mainly exploit Lagrange polynomial interpolation along with elliptic curve groups over finite fields. These systems require collecting a specific number of legitimate users' private keys to complete the authentication phase; that is, each entity must acquire tokens from all other entities, making the scheme impractical for large-scale environments. The reliance on secret sharing also makes these algorithms vulnerable to disruption by a single malicious entity. 
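The Lagrange-interpolation-based schemes discussed above build on Shamir's secret sharing: a degree t-1 polynomial is sampled with the secret as its constant term, each member holds one point, and any t members can jointly recover the secret by interpolating at zero. A minimal sketch (the field modulus and parameters are chosen for illustration only):

```python
import random

P = 2**61 - 1  # a Mersenne prime used as the field modulus (illustrative)

def make_shares(secret, t, n):
    # random polynomial of degree t-1 with constant term = secret
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation of the shared polynomial at x = 0
    secret = 0
    for j, (xj, yj) in enumerate(shares):
        num, den = 1, 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = num * (-xm) % P
                den = den * (xj - xm) % P
        secret = (secret + yj * num * pow(den, P - 2, P)) % P
    return secret
```

The sketch also makes the criticism above concrete: reconstruction needs t honest shares at once, so a single participant submitting a bad share disrupts the result, and the work grows with the number of contributed shares.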
Additionally, in current algorithms based on polynomial interpolation, the cost of authentication and key establishment also depends on the number of users, which poses a scalability challenge. Introducing a novel methodology, this study advocates the adoption of linear spaces for group authentication and key generation, scalable to groups of varying sizes. Leveraging linear spaces minimizes the computational and communication burden of establishing a shared key within the group. The inherent benefits of vector spaces make the proposed method particularly well suited for energy- and resource-constrained devices, positioning it as a viable option for integration within Internet of Things (IoT) networks. A standout feature of our work is its ability to empower any user in a group to elevate a non-member to member status, a feature with potential utility for future autonomous systems. The scheme is crafted so that the sponsors of these new members can be identified by all members of the group. Moreover, unlike polynomial interpolation based schemes, the proposed scheme easily identifies non-members, which helps prevent the denial of service (DoS) attacks that previous group authentication algorithms struggled with. The method proposed in this thesis offers a lightweight group authentication solution that verifies participants in environments with energy- and resource-constrained devices, independent of the number of users. During the group authentication phase, a subspace of the universal space, together with its basis set and a polynomial, is selected by the corresponding group manager and kept secret. The basis sets, derived from the chosen main basis set and polynomial, are distributed to users as their private keys. In each group authentication session, a random vector that does not lie in the subspace and a nonce vector are selected and published. 
Using their basis sets, users are expected to find the projection of the published vector and calculate its inner product with the session nonce vector. Participants are verified by sending some bits of the calculated value. In the key generation phase, the steps are identical to those in the group authentication phase; for the projection step, different vectors are publicly disclosed and used for the same purpose. Participants obtain the key for group communication by performing the same operations. Notably, individuals who do not hold a basis set cannot participate in the key-building phase, which enhances the overall security level. In scenarios where the group administrator is not directly involved, this study enables any authorized user within the group to add new members. The authorized user adjusts their basis set by selecting specific elements and shares it with the prospective group member. By leveraging this basis set, it becomes possible to identify the individual responsible for adding someone to the group. Within the scope of this thesis, we delve into recent group authentication studies in the literature, address their challenges, and propose a novel linear-algebra-based group authentication scheme that overcomes them. Additionally, we present real-time analyses comparing our algorithm with existing studies, supported by tables and graphs.
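The projection-and-inner-product idea can be illustrated with a toy example over a prime field. This is not the thesis construction (the polynomial-derived basis sets and the bit-selection step are omitted, and the parameters and vectors below are invented); it only shows that any orthogonal basis of the same secret subspace yields the same projection, hence the same session value:

```python
p = 10007  # small prime modulus, toy parameter

def inner(u, v):
    return sum(a * b for a, b in zip(u, v)) % p

def project(v, basis):
    # orthogonal projection of v onto span(basis); the basis is assumed
    # pairwise orthogonal with invertible self inner products mod p
    out = [0] * len(v)
    for b in basis:
        coef = inner(v, b) * pow(inner(b, b), p - 2, p) % p
        out = [(o + coef * x) % p for o, x in zip(out, b)]
    return out

basis = [[1, 2, 0, 0], [0, 0, 3, 1]]  # disjoint supports => orthogonal
v = [5, 9, 4, 7]       # published session vector
nonce = [2, 3, 1, 8]   # published nonce vector
tag = inner(project(v, basis), nonce)  # every member derives the same value
```

Because the projection depends only on the subspace, members holding different bases of the same secret subspace agree on `tag`, while outsiders without any basis cannot compute it.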
-
ÖgeGroup authentication for next generation networks(Graduate School, 2022-05-12) Aydın, Yücel ; Özdemir, Enver ; Kurt Karabulut, Güneş Zeynep ; 707172003 ; Cybersecurity Engineering and CryptographyIn this thesis, performing handover operations as a group is proposed and simulated to decrease latency and the number of communication rounds. The security aspects of authentication and handover for drone swarms are presented. Drone swarms were selected in order to examine authentication in a group setting and because of the growing use of drones in daily life. The number of drones used for military or commercial applications grows every day; border security, visual shows, and cargo delivery are some examples of drone applications. Due to limited flying time and coverage area, a single drone cannot perform intensive tasks. While providing mobile service via aerial base stations, some UxNBs can return to the control station while new drones are sent to the area to accomplish the tasks. For these reasons, drone swarms are preferred over a single drone for intensive tasks. The first security problem for the drone swarm is the authentication of new drones sent by the drone control station to join the swarm. If a drone could join the swarm without authentication, an intruder could impersonate a drone and send it to the swarm for various attacks. In addition to authentication, communication inside the swarm should be encrypted, and each party should use a group key, which may also be shared with a newly authenticated drone. The next security requirement for the drone swarm is the mutual authentication of two drone swarms in order to perform more intensive tasks. 
If the 5G solution for UAV authentication is applied to mutual authentication, the number of communication rounds and scalability must be taken into consideration, since each party from one swarm would have to authenticate with each UAV from the other swarm. Group authentication solutions may be used to overcome these scalability and communication-volume issues. Drone swarms also face security and latency issues in handover operations, of which there are two kinds: the handover of a drone swarm from the serving terrestrial base station to a new base station, and the handover of UxNBs when the base station is aerial rather than terrestrial. The serving UxNB may run out of flying time, in which case the drone swarm starts to receive service from a new UxNB. In this thesis, a lightweight group authentication scheme is applied to the authentication and handover operations of drone swarms. The 5G UAV authentication and handover methods and the group-based solutions are implemented in simulation, and the results are compared. According to the results, the group authentication solutions provide lower latency and less communication for drone swarms.
-
ÖgeImplementation and analysis of the secret key generation algorithm using software defined radios(Graduate School, 2024-06-27) Alper, Ertuğrul ; Özdemir, Enver ; 707211016 ; Cybersecurity Engineering and CryptographyAs the use of wireless communication systems increases, their security has become a critical focus due to various technological advancements. Given the diversity of applications and technologies, it is not possible to address the security concerns of all wireless systems in a single study. Therefore, this thesis presents the design, analysis, and implementation of a cryptographic secret key generation algorithm within two- and three-node distributed wireless systems featuring full-duplex multiple access channels, aimed at improving security in wireless communications. In addition, the thesis includes a comprehensive review of the literature on multiple access channels and computational techniques, discussing the findings in detail. In the following chapters of the thesis, wireless communication systems are explained, and then multiple access channels are examined in detail. In this part, wireless full-duplex multiple access channels (W-FMAC) are emphasized in particular, and this technology is used in the simulations and implementations. In addition, examples of wireless half-duplex multiple access channels (W-HMAC) and non-orthogonal multiple access (NOMA) channels are discussed comprehensively together with their usage areas. Afterwards, function computation (FC) techniques are defined, which compute functions of the transmitted signals over the air and deliver meaningful information to the receiver. It is explained how these computations can be carried out over the air and what kind of designs are required at the sender and receiver nodes. 
Afterwards, it is emphasized that the analog function computation (AFC) technique is used in this project, with pre-processing and post-processing functions applied at the transmitting and receiving antennas, respectively. In addition, digital function computation (DFC) is examined in this section and compared with the AFC technique. This background is critical for simulating and implementing the cryptographic key generation algorithm described later in the thesis, and the two- and three-node test systems are created on this basis. In the following part of the thesis, the cryptographic key generation algorithm, the main theme of the study, is discussed in detail. First, using the wireless full-duplex multiple access channel technique, a system consisting of N users is designed and presented with the system model. Then the AFC technique required for the implementation of the secret key generation algorithm is applied, and the processing functions are explained. In this section, it is emphasized that the secret keys chosen by the nodes are Gaussian primes, and it is shown that those primes form the main basis of the system. Afterwards, the channel model is created in the simulation environment, and the channel parameters are shown. Subsequently, error models are created to measure the success of the secret key generation algorithm implemented in the test environment. These error models are based on the distance between users and the channel estimation coefficients, and the success of the system is measured by performing Monte Carlo simulations in the test environment. Detailed explanations of the results are then given in the performance evaluation section. Afterwards, the results obtained are discussed, and the ideal values of the system parameters are shared to improve the implementation of the algorithm. 
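Since the node keys are stated to be Gaussian primes, a small sketch of the standard Gaussian-primality criterion may help: a Gaussian integer a + bi is prime exactly when its norm a² + b² is a rational prime (for a, b both nonzero), or when one coordinate is zero and the other is, in absolute value, a rational prime congruent to 3 mod 4. The trial-division primality test below is for illustration only, not how the thesis implements it:

```python
def is_prime(n):
    # naive trial division, sufficient for a toy example
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    f = 3
    while f * f <= n:
        if n % f == 0:
            return False
        f += 2
    return True

def is_gaussian_prime(a, b):
    """Check whether the Gaussian integer a + bi is a Gaussian prime."""
    if a and b:
        return is_prime(a * a + b * b)
    n = abs(a or b)
    return is_prime(n) and n % 4 == 3
```

For example, 2 + i is a Gaussian prime (norm 5), while 5 itself is not, since 5 = (2 + i)(2 - i).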
Furthermore, the term software defined radio (SDR) is explained, and its capabilities and usage areas are shown. GNU Radio, the most common open source software toolkit used to program SDRs, is introduced. Then, the platforms compatible with GNU Radio, the installation process, and the creation of software blocks are investigated; this discussion is enhanced with sample designs and flowgraphs. In the following section, the Universal Software Radio Peripheral (USRP), the hardware counterpart of software defined radios, is discussed, and its hardware architecture is explained. Then, different Ettus USRP devices are compared according to various factors, and their pros and cons are presented; it is noted that the USRP B210 model is used in this study. In addition, it is described how to use the USRP source and sink blocks in GNU Radio and what parameters need to be set. In the next stage of the thesis, based on the background described in the previous sections, it is explained how the secret key generation algorithm is implemented using an SDR. First, the software and hardware required to perform this operation are shown. Then, the focus turns to how the secret key is transferred in a two-user system and how it is reconstructed at the receiver node. This is followed by a detailed diagram of the transmitter and receiver systems created in GNU Radio, the flow chart, all the parameters used, and the software blocks created. Finally, the secret key value obtained in this study is compared with the theoretically calculated secret key, and error calculations are made. In the final section of the thesis, a summary of all these operations is provided and the practical implementation of the study is highlighted once more. Finally, the thesis outlines the scope of subsequent research, presented as an extension of this work, and identifies the areas that will receive further development.
-
Ögeİkili kuadratik form ile grup kimlik doğrulaması(Lisansüstü Eğitim Enstitüsü, 2023-01-31) Aksoy, Filiz ; Özdemir, Enver ; Özer, Özen ; 707191004 ; Bilgi Güvenliği Mühendisliği ve KriptografiCryptology is the science of designing the algorithms and protocols required for secure communication between parties in the digital environment. The security of any data flow in the virtual environment is provided by cryptographic building blocks. Today, with the development of technology and the spread of the Internet, information sharing has become critically important, and new models for secure information sharing are constantly being developed. The aim of cryptography is not only to develop algorithms for encrypting and decrypting messages, but also to solve real-world problems that require information security; in other words, to prepare suitable building blocks that ensure the secure transfer of data flowing through the virtual environment. The suitability of these building blocks depends on many factors; compatibility with the existing hardware and with the data rates users expect is among the foremost goals. The security of digital communication is achieved by meeting four predetermined goals. The first of these is confidentiality, the secrecy of the message. The main purpose of the algorithms designed to deliver a message securely to the other party is to prevent third parties from reading it. The digital medium is assumed to be visible to everyone, so a message sent in the clear could be read by anyone; ensuring that a message can be read only by its intended recipients is one of the most important elements of secure communication. Another goal is data integrity, that is, preventing the content of the message from being altered. The content of a message can be changed by errors that occur during transmission or by parties who intercept it. 
Hash functions are generally used to prevent such manipulations and alterations. Another goal that secure communication must satisfy is authentication, that is, verifying the source of the message and the recipient. For this purpose, methods such as digital signatures, which include information like the identity of the message's creator and a timestamp, are used. The last goal is non-repudiation: the sender of a message cannot later claim not to have sent it. Methods such as digital signatures eliminate repudiation while also providing authentication. Confidentiality, the most important pillar of secure communication, is provided by symmetric cryptographic algorithms; from 1974 to the present, almost all digital communication channels have used standardized symmetric-key algorithms. In symmetric-key cryptosystems, both the sender and the receiver must hold the same key. Although confidentiality is provided by standard symmetric encryption methods, how the parties obtain the same key becomes the most important problem. Before sharing a key, the parties are expected to determine who they are talking to, that is, to perform authentication; key exchange follows authentication. Until now, authentication and key exchange algorithms have been designed for settings with a single sender and a single receiver. Today, however, communication is no longer one-to-one but consists of systems in which tens or even thousands of devices exchange data at the same time. For example, with the development of technologies such as the Internet of Things (IoT), many devices serving the same purpose must perform fast authentication and key exchange simultaneously. 
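The role of hash functions in integrity protection described above can be illustrated in a few lines of Python (the message contents are invented; note that against an active attacker who can replace both message and digest, a keyed construction such as HMAC would be used instead of a bare hash):

```python
import hashlib

# sender computes a digest of the message and transmits both
message = b"wire transfer: 100 TL to account 42"
digest = hashlib.sha256(message).hexdigest()

# receiver recomputes the digest; any modification yields a different value
tampered = b"wire transfer: 900 TL to account 42"
received_ok = hashlib.sha256(message).hexdigest() == digest
tamper_detected = hashlib.sha256(tampered).hexdigest() != digest
```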
Since data flows from each device to the others, each of the tens or perhaps thousands of devices in the environment is expected to perform authentication and key sharing with every other device. Moreover, considering that not all devices joining the communication network have the same technical capacity, the need for a model that also supports low-power devices grows day by day. As more devices are expected to join the data traffic, it is clear that efficient multi-party authentication and key exchange algorithms will soon become even more necessary. In this thesis, an efficient authentication algorithm for such multi-party environments is developed, together with a key establishment algorithm that can be used in practice. The performance of the presented algorithms is analyzed, and their security parameters are presented through cryptanalysis. The designed algorithms use a new mathematical tool: binary quadratic forms, long studied in number theory. The first chapter gives a short introduction to cryptography, describing the basic building blocks of symmetric and asymmetric key algorithms and illustrating their security with examples. It also illustrates how existing cryptographic building blocks are used to achieve the desired properties of secure communication; in this context, PGP, one of the most important protocols securing e-mail services, is discussed as an example of the effective use of these building blocks. The second chapter explains one-to-one authentication and group authentication in detail, and then reviews recent work on group authentication designed for multi-party authentication and multi-party key sharing. The third chapter discusses in detail binary quadratic forms, the mathematical building blocks of the group authentication and key exchange algorithms we present. 
Binary quadratic forms, primitive forms, positive definite forms, equivalence of quadratic forms, equivalence classes, and reduced forms are explained in this chapter. The final chapter presents the details of the proposed model for group authentication with binary quadratic forms and compares its theoretical performance with other group authentication models.
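As a small illustration of the reduction theory mentioned above: a positive definite binary quadratic form ax² + bxy + cy² (discriminant b² - 4ac < 0, a > 0) is reduced exactly when |b| ≤ a ≤ c, with b ≥ 0 required whenever |b| = a or a = c. A sketch, independent of the thesis's actual protocol:

```python
def discriminant(a, b, c):
    # discriminant of the form a*x^2 + b*x*y + c*y^2
    return b * b - 4 * a * c

def is_reduced(a, b, c):
    """Check whether a positive definite form (a, b, c) is reduced."""
    if discriminant(a, b, c) >= 0 or a <= 0:
        return False  # not a positive definite form
    if not (abs(b) <= a <= c):
        return False
    if (abs(b) == a or a == c) and b < 0:
        return False  # boundary cases require b >= 0
    return True
```

For discriminant -23, for instance, the reduced forms are (1, 1, 6), (2, 1, 3), and (2, -1, 3), so the class number is 3; each equivalence class contains exactly one reduced form.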
-
Ögeİnternet bankacılığında kimlik doğrulama yöntemleri(Lisansüstü Eğitim Enstitüsü, 2022-02-04) Hezer, Şahadet Gülşah ; Özdemir, Enver ; 707191008 ; Bilgi Güvenliği Mühendisliği ve KriptografiToday, digitalization has increased enormously, and with each passing day more of our everyday tasks can be carried out over the Internet. Especially for financial transactions performed online, we need to be able to trust the systems we use in order to avoid monetary loss. In this respect, the authentication process, the first step in initiating a transaction, is of particular importance. In finance, authentication generally takes place when logging into a banking account and when making purchases. Authentication is the first step to get right in order to avoid financial loss and to prevent others from transacting or observing on our behalf. If authentication is not performed securely, someone else may use or gain access to our account; likewise, purchases may be carried out by others without our consent. To prevent such situations, we need authentication systems and mechanisms. The first part of this study describes what an authentication system is, its types, and its historical development. The second part describes the authentication methods that are used, or could be used, when logging into a bank account over the Internet. The third part describes the authentication processes and methods that are used, or could be used, when making purchases with bank cards. Finally, the study presents the new authentication method we have developed and a security comparison with existing authentication protocols. 
Authentication systems vary according to the security features they provide, their speed, and the technology they use, and each new system has the potential to be more useful than its predecessor. The algorithms and protocols presented here give an overview of many of the authentication methods currently used, or usable, in the financial domain. Together with the newly proposed authentication method, they can bring a new perspective to authentication in finance.
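As one concrete example of the one-time-password style of login factor commonly found among banking authentication methods (the thesis does not prescribe this particular scheme; it is shown here only as an illustration), a minimal time-based one-time password (TOTP) generator following RFC 6238 can be sketched:

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, t=None, step=30, digits=6):
    """RFC 6238 TOTP: HMAC-SHA1 over the 30-second time counter."""
    if t is None:
        t = time.time()
    counter = int(t // step)
    msg = struct.pack(">Q", counter)          # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)
```

With the RFC 6238 test secret `b"12345678901234567890"` and time 59 seconds, an 8-digit code of `94287082` is produced, matching the specification's test vector.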
-
ÖgeKuantum sonrası kriptografi(Lisansüstü Eğitim Enstitüsü, 2023-01-26) Gültekin, Veysel ; Özdemir, Enver ; 707191009 ; Bilgi Güvenliği Mühendisliği ve KriptografiWith the advent of quantum computers, the security of what we call classical cryptography, the cryptosystems used today in all cryptographic devices, has become questionable. As a result of rapid progress in quantum computing, research in cryptography has shifted toward designing and analyzing systems that are secure against quantum computers. With Shor's algorithm reducing the factoring problem from exponential to polynomial time, asymmetric cryptosystems in particular have come under threat. Today, the vast majority of data is stored encrypted with symmetric algorithms, using keys shared through the key agreement algorithms of classical cryptography. Although the main impact of quantum computing is on asymmetric cryptography, all data is affected by the quantum threat because the keys of symmetric encryption algorithms are shared via asymmetric cryptography. In particular, since encrypted data can be retained for later analysis, if quantum computers soon reach the level of solving the factoring problem efficiently, data encrypted today with symmetric algorithms could also be decrypted in the future. Signature algorithms will be affected to a more limited extent, because a signature provides a cryptographic service bound to the moment of use, such as an authentication mechanism or verification of a message's origin. The key sharing problem is therefore seen as a more urgent need than the signature problem with respect to the quantum threat. For these reasons, this study focuses on the solutions that post-quantum cryptography brings to the key sharing problem. 
For both of the problems mentioned here, NIST (the National Institute of Standards and Technology), which had previously standardized AES as a block cipher through a competition, decided to select a standard post-quantum key exchange algorithm and signature algorithm, launching a competition in 2016. After this, work on algorithm design and analysis accelerated. The submitted algorithms can be divided into groups according to the hard problems they rely on: lattice-based, code-based, isogeny-based, hash-based, and multivariate polynomial based cryptosystems. Among these, lattice-based cryptosystems have attracted the most attention and study, partly because they can also be used in constructions such as homomorphic encryption. Moreover, the Kyber algorithm, selected as the standard at the end of the competition, is a lattice-based key exchange algorithm. Code-based cryptosystems were introduced by McEliece and rest on the decoding problem, one of the oldest problems of asymmetric cryptography. Because of the disadvantage of large key sizes, the area remained relatively little studied until the quantum threat emerged. With the appearance of the Niederreiter cryptosystem and improvements in key sizes, code-based schemes became an important candidate for post-quantum key exchange. Isogeny-based algorithms fell into a risky position with the emergence of an attack that can be run on classical computers; although NIST had included the SIKE key agreement algorithm among the alternate candidates, the attack appeared after that date, so NIST added a note recommending against its use. Furthermore, since some cryptosystems are considered insufficiently analyzed, using post-quantum key agreement algorithms on their own has begun to be viewed with skepticism. 
For this reason, hybrid key sharing protocols have also been proposed, which combine at least two of the following: post-quantum algorithms, asymmetric cryptography, and previously shared symmetric keys. Within the scope of this study, the proposals developed to solve the key agreement problem are outlined. The problems underlying lattice-based and code-based cryptosystems, and how key exchange mechanisms are built on those problems, are explained. In addition, some analysis methods are described, giving information about the security levels and the analysis of several standardized algorithms.
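The lattice-based key establishment idea discussed above can be illustrated with a toy Regev-style scheme encrypting a single bit under the learning-with-errors (LWE) assumption. The parameters below are far too small to be secure, and the scheme is a textbook sketch, not Kyber:

```python
import random

# toy LWE parameters (insecure, for illustration only)
n, q, m = 8, 3329, 20

def sample_error():
    # small noise in {-1, 0, 1}
    return random.choice([-1, 0, 1])

def keygen():
    s = [random.randrange(q) for _ in range(n)]                 # secret key
    A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
    b = [(sum(A[i][j] * s[j] for j in range(n)) + sample_error()) % q
         for i in range(m)]                                     # b = A*s + e
    return s, (A, b)

def encrypt(pk, bit):
    A, b = pk
    S = [i for i in range(m) if random.random() < 0.5]  # random row subset
    c1 = [sum(A[i][j] for i in S) % q for j in range(n)]
    c2 = (sum(b[i] for i in S) + bit * (q // 2)) % q    # encode bit at q/2
    return c1, c2

def decrypt(s, ct):
    c1, c2 = ct
    d = (c2 - sum(c1[j] * s[j] for j in range(n))) % q
    # d is near 0 for bit 0 and near q/2 for bit 1
    return 1 if q // 4 < d < 3 * q // 4 else 0
```

Because the accumulated noise is at most m = 20 while the decision threshold is about q/4 ≈ 832, decryption is always correct here; real schemes choose parameters so that the noise stays below the threshold with overwhelming probability.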
-
ÖgeNitelikli elektronik imzaların kullanılabilirliğinin değerlendirilmesi: Sistematize edilmiş kullanım durumları ve tasarım paradigmaları(Lisansüstü Eğitim Enstitüsü, 2024-12-10) Çağal, Mustafa ; Bıçakcı, Kemal ; 707201023 ; Bilgi Güvenliği Mühendisliği ve KriptografiAlthough legally equivalent to handwritten signatures, Qualified Electronic Signatures (QES) have not yet achieved significant market success. QES has great potential to reduce dependence on paper-based contracts, enable secure digital applications, and standardize public services. However, despite its wide range of use cases, there are few studies on its usability; this thesis was prepared to fill that gap. Within its scope, the necessity of evaluating the strengths and weaknesses of Qualified Electronic Signatures through usability studies is emphasized, QES use cases are systematized, and the design paradigms supporting those use cases are categorized. In addition, findings obtained from cognitive walkthroughs conducted on the use cases of four different QES systems are presented. The research questions are as follows: 1. What are the use cases of Qualified Electronic Signatures (QES) across Turkey and the European Union? By "use cases" we mean the set of tasks that standard users must carry out. 2. What are the design paradigms, options, and sub-use-cases associated with these use cases? 3. Considering these use cases and design paradigms, what are the strengths, weaknesses, and usability challenges of practical QES systems? The European Union and Turkey were chosen as the focus of the research because they offer the potential to evaluate the strengths and weaknesses of different QES systems. 
The main actors involved in QES processes are examined in detail, and it is found that the large number of actors in QES processes can affect usability. QES use cases, the sub-use-cases supporting them, and the design paradigms are identified and categorized. Subsequently, a total of 36 cognitive walkthroughs were carried out. The most important finding of the study is that remote signatures are more usable than the other alternatives. To make Qualified Electronic Signatures a more attractive option for standard users, the study concludes that remote signatures should be legalized in Turkey and in other countries that, like Turkey, have not yet regulated them. This thesis is expected to provide a foundation for significantly expanding research on the usability of Qualified Electronic Signatures.
-
ÖgePrivacy and security enhancements of federated learning(Graduate School, 2024-07-12) Erdal, Şükrü ; Özdemir, Enver ; Karakoç, Ferhat ; 707211008 ; Cybersecurity Engineering and CryptographyFederated Learning has emerged as a revolutionary approach in the field of machine learning, addressing significant concerns related to data privacy and security. Traditional centralized machine learning models require data aggregation on central servers, posing substantial risks of data breaches and privacy violations. FL, on the other hand, distributes the model training process across multiple decentralized edge devices, keeping the raw data localized and mitigating the privacy risks associated with centralized data storage and processing. The motivation for this thesis stems from the growing need to enhance privacy and security in FL applications. As data privacy regulations become more stringent and public awareness of data security increases, there is a pressing demand for robust FL frameworks that can protect sensitive information while maintaining high model performance. FL's ability to leverage the computational power of edge devices, such as smartphones and IoT gadgets, makes it a promising solution for various domains including healthcare, finance, and the Internet of Things. The primary objectives of this thesis are threefold: 1. To provide a comprehensive survey of existing research on privacy-enhanced FL, synthesizing key concepts, methodologies, and findings. 2. To identify gaps, limitations, and open research questions in the current literature on privacy-enhanced FL. 3. To evaluate and compare different privacy-enhancing techniques and methodologies used in FL, assessing their effectiveness, scalability, and trade-offs. FL inherently mitigates several privacy risks by keeping data local to clients. However, it introduces new challenges, particularly related to inference attacks and model update poisoning. 
Inference attacks exploit model updates to extract sensitive information, while model update poisoning involves malicious clients injecting false updates to corrupt the global model. These challenges necessitate robust solutions to ensure the integrity and privacy of the FL process. Non-IID data and communication overheads further complicate FL implementation. Non-IID data, where data distributions vary across clients, can hinder model convergence and performance. Additionally, frequent and substantial data exchanges between clients and servers result in significant communication overheads, which can strain network resources. Several strategies have been developed to address these privacy and security challenges. Differential privacy introduces noise to data updates, ensuring that individual contributions remain confidential. Protocols that incorporate cryptographic signatures and Secure Multiparty Computation techniques further enhance the security of model updates and ensure data integrity. Co-utility frameworks, which promote mutual benefit between servers and clients, and robust aggregation methods also play vital roles in safeguarding FL systems. Innovative methodologies such as Flamingo and SafeFL leverage advanced cryptographic techniques to provide secure aggregation and enhance privacy preservation. These solutions collectively improve the robustness, efficiency, and security of FL frameworks, enabling their application in real-world scenarios. FL has been applied successfully in various domains, demonstrating its versatility and effectiveness. In wireless communication, FL enhances vehicular communication, localization, and semantic communication by enabling collaborative model training without data centralization. In the IoT sector, FL improves privacy and reduces data transfer costs, with significant applications in smart homes and industrial IoT. Healthcare is another critical area where FL has made substantial impacts. 
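The differential-privacy and aggregation steps described above can be sketched in a few lines: each client clips its update to a bounded norm, adds noise, and the server averages the noisy updates. The clipping threshold, noise scale, and update vectors below are illustrative, and real FL systems add many refinements (per-round privacy accounting, secure aggregation, and more):

```python
import random

def fedavg(updates):
    # plain federated averaging of equal-length client weight vectors
    k = len(updates)
    return [sum(u[i] for u in updates) / k for i in range(len(updates[0]))]

def dp_noise(update, clip=1.0, sigma=0.5):
    # clip the update to L2 norm <= clip, then add Gaussian noise
    norm = sum(x * x for x in update) ** 0.5
    scale = min(1.0, clip / norm) if norm > 0 else 1.0
    return [x * scale + random.gauss(0.0, sigma * clip) for x in update]

# three illustrative client updates
clients = [[0.2, -0.1, 0.4], [0.3, 0.0, 0.1], [0.1, -0.2, 0.5]]
noisy = [dp_noise(u) for u in clients]
global_update = fedavg(noisy)
```

Clipping bounds each client's influence on the average, which is what makes the added noise yield a differential-privacy guarantee; larger `sigma` means stronger privacy but slower convergence.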
By allowing institutions to collaboratively train models on medical imaging and predictive analytics without sharing patient data, FL addresses stringent privacy regulations while improving model accuracy and generalizability. Studies have shown that FL can maintain high diagnostic accuracy and support personalized medicine. In the financial sector, FL addresses privacy and regulatory challenges by enabling collaborative credit risk assessment and fraud detection. By leveraging data from multiple institutions without centralizing it, FL-based models achieve higher accuracy and adaptability, enhancing the detection of fraudulent activities and improving credit scoring models. Surveys play indispensable roles and offer numerous benefits within the FL domain. They serve as comprehensive repositories of existing research, providing newcomers with a foundational understanding while guiding experienced researchers toward unexplored frontiers. By scrutinizing and synthesizing a plethora of literature, surveys identify emerging trends, highlight successful applications, and outline future research directions. Federated Learning presents a transformative approach to machine learning by enabling decentralized data processing, which addresses critical privacy and security concerns inherent in traditional centralized models. This thesis explored various facets of FL, particularly focusing on the challenges and solutions related to privacy and security, as well as its diverse applications across different sectors. Emerging trends in FL research, including advancements in cryptographic techniques, federated learning frameworks, and regulatory compliance mechanisms, underscore the need for continuous innovation and interdisciplinary collaboration. As FL continues to evolve, it holds the potential to revolutionize secure communication systems and foster a culture of security awareness and privacy by design in machine learning technologies.
-
Privacy-preserving authentication methods (2024-08-23) Baykal Nari, Kübra ; Özdemir, Enver ; 707182004 ; Cybersecurity Engineering and Cryptography

The last century of the technology age has introduced us to many trends that will shape our future. The Internet is no longer limited to our computers; almost every device we use in our daily lives now has an Internet connection. Smartphones, smartwatches, connected cars, smart home technologies, and even smart kitchen appliances are part of most of our lives. While the number of IoT devices is measured in billions today, it is inevitable that this number will increase exponentially. These devices that make our lives easier and improve our quality of life may not be as innocent as they seem. We share all kinds of personal data with them: our sleep patterns, our pulse, our home temperature, our home and vehicle locations, how often we clean our house, what we eat for dinner, and more. At this point, concerns about the security of our personal data become seriously important. Information security rests on the CIA triad: confidentiality, integrity, and availability. Through various security mechanisms, information security ensures that data can be accessed only by authorized persons and institutions without compromising its integrity, and that it cannot be accessed by unauthorized parties. These mechanisms rely on cryptographic algorithms whose security depends on mathematical problems considered hard. However, many of these traditional methods are practical only on devices without energy restrictions, such as computers or servers.
Considering the processing power and energy capacity of devices in an Internet of Vehicles (IoV) environment, the security solutions currently used in information technology become impractical. The first step in protecting information is establishing secure communication and properly authenticating the identity of the party involved. From the past to the present, cryptographic algorithms have been employed in authentication systems. These algorithms, as mentioned above, are based on the hardness of various mathematical problems. For example, while the security of the RSA algorithm is based on the difficulty of factoring large numbers, the security of the Diffie-Hellman key exchange depends on the difficulty of solving the discrete logarithm problem. Although these algorithms secure systems to a degree, they are quite costly in terms of computational load. Given the nature of today's resource-constrained devices, integrating such algorithms into them is often not feasible. At this stage, a research area emerges around security algorithms for devices with high mobility and limited resources. The IoV concept, a sub-branch of IoT, has recently grown in popularity, but studies on the IoV environment are still scarce. Practically applicable research that can meet the requirements of these devices will shed light on our future. The method proposed within the scope of the thesis targets connected autonomous vehicles, IoV environments, and platooning in IoV environments as its application areas. It is a privacy-preserving group-based authentication scheme whose working principle is based on pre-defined groups and the communication among them. The method involves components such as vehicles, the groups that contain them, group managers that manage and conduct the authentication processes within the groups, and roadside units (RSUs).
There are two basic steps for a vehicle to join a group and perform authentication operations: the initial registration phase and the authentication or group handover phase. During initial registration, the vehicle receives a key pair from a certification authority; this key pair is reserved for legal situations and can be used only by legally authorized organizations to access the vehicle in cases such as traffic accidents or malicious use. During the same phase, the vehicle also receives a key pair containing the group's public and private keys, which are used for subsequent authentication and group handover operations. After initial registration, the vehicle is included in a group, and authentication for future group handovers is conducted by the group manager. Once the initial registration phase completes successfully, the vehicle joins a group structure with other vehicles located in the same geographical area as the group manager. When the vehicle starts traveling and enters the coverage area of another (target) group manager, it sends a group handover request to its own group manager. Group managers share a secure communication channel, over which the group's private function is distributed among them. The target group manager then shares a temporary, timestamp-based nonce with the vehicle. The vehicle combines its secret key with the nonce to generate a value, uses that value as a symmetric key to encrypt its own group secret key, and sends the result to the target group manager. The target group manager decrypts the ciphertext, compares its own computed value with the vehicle's decrypted value, and thereby authenticates the vehicle.
The target group manager sends the new group information, that is, the group public and private key pair, to the successfully authenticated vehicle; the vehicle is thus included in the new group. All of these processes complete in under a millisecond, making the method highly advantageous for a resource-constrained IoV vehicle. The method is an applicable candidate not only in the IoV environment but also in other systems where a group structure can be constructed. Additionally, as an application of the proposed method, a protocol for public transportation platoons in smart cities has been proposed. The symmetric-key encryption algorithm employed during the authentication phase is left flexible, depending on the configuration of the system into which it is integrated; in the proposed method and tests, the AES algorithm was used. The thesis includes a literature review covering many current studies on various vehicular networks. To compare the proposed method, some of these current methods were implemented; comparisons are based on real-time analyses, with comprehensive result graphs and tables. Test results reveal the advantages of the privacy-preserving group-based authentication method over its alternatives. A detailed security analysis demonstrates that the method is an effective security-solution candidate, resistant to known attacks and applicable to IoV systems.
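The handover exchange described above can be sketched end to end. This is a loose illustration under stated assumptions: the key derivation and the symmetric cipher are stand-ins (an HMAC-based derivation and a toy SHA-256 keystream in place of the AES encryption the thesis uses), and all variable and function names are hypothetical:

```python
import hashlib
import hmac
import secrets
import time

def derive_key(secret_key: bytes, nonce: bytes) -> bytes:
    """Combine the vehicle's secret key with the manager's nonce into a
    one-time symmetric key (HMAC-based stand-in for the thesis construction)."""
    return hmac.new(nonce, secret_key, hashlib.sha256).digest()

def encrypt(key: bytes, data: bytes) -> bytes:
    """Toy XOR keystream 'cipher' — a stand-in for AES, illustration only."""
    stream, counter = b"", 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

decrypt = encrypt  # an XOR keystream is its own inverse

# The group secret, shared among group managers over their secure channel;
# the vehicle holds it as a member of its current group.
group_secret = secrets.token_bytes(32)
vehicle_secret = group_secret

# 1. The target manager issues a temporary, timestamp-based nonce.
nonce = hashlib.sha256(str(time.time()).encode()).digest()

# 2. The vehicle derives a one-time key and encrypts its group secret key.
k_vehicle = derive_key(vehicle_secret, nonce)
ciphertext = encrypt(k_vehicle, vehicle_secret)

# 3. The target manager, knowing the group secret via the inter-manager
#    channel, derives the same key, decrypts, and compares.
k_manager = derive_key(group_secret, nonce)
authenticated = decrypt(k_manager, ciphertext) == group_secret
```

Only symmetric primitives and one hash per step are involved, which is consistent with the sub-millisecond handover times the thesis reports.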
-
Rostam: A passwordless web single sign-on solution integrating credential manager and federated identity systems (Graduate School, 2023) Mahnamfar, Amin ; Bıçakçı, Kemal ; 909785 ; Cybersecurity Engineering and Cryptography Program

The challenge of transitioning to a passwordless future is a multifaceted issue, especially as web applications continue to lean heavily on passwords for authentication. The problem is amplified in enterprise environments, where identity providers, tasked with overseeing federated identity management systems, maintain single sign-on (SSO) services that lack universal compatibility with all applications. To address these complexities, we introduce Rostam, a passwordless solution that integrates credential management with federated identity systems, streamlining access to web applications. Password managers, per the literature, fall broadly into two categories: password wallets and derived passwords. Derived-password managers generate unique passwords for websites by combining a master password with supplemental information, such as the target domain name. However, they come with certain limitations, such as requiring users to change their existing passwords for websites. Consequently, we have chosen the password-wallet category, the prevalent choice among both commercially available and browser-integrated password managers. This approach offers a more secure and user-friendly way of managing online credentials, allowing users to retain their current credentials securely in encrypted form. Rostam integrates seamlessly through a dashboard that displays all applications accessible to a user with a single click after a passwordless SSO process. This intuitive interface removes the need for users to memorize multiple passwords, simplifying the user experience by centralizing access to diverse applications.
We examined existing work and adhered to essential use cases and design paradigms in credential managers while designing Rostam. For instance, Rostam keeps the setup process simple: installing the mobile app and browser extension and creating a cloud account. It accommodates credential registration through both manual entry and auto-detection, updates credentials manually or through auto-detection, and allows manual credential removal. Rostam enhances the user experience by providing various autofill options and handling separate subdomains. It also ensures security with manual lock and timed auto-lock features that require the user to reauthenticate. These diverse use cases and paradigms cater to the varied needs of users and underscore Rostam's comprehensive approach to credential management. Many credential managers focus on thwarting server-side attacks and bolstering privacy through client-side encryption, requiring users to select and remember a strong master password. However, memorizing such passwords poses a significant challenge. While there have been efforts to counteract this issue, such as spaced repetition or graphical password schemes, these methods are not as robust as randomly generated long keys. Furthermore, features such as temporary or even permanent storage of the master password, added to improve the user experience, compromise security, and the ability to change or reset the master password is absent from some widely used credential managers. Our proposed system prioritizes security by employing a MasterKey, instead of a master password, to protect the encrypted passwords stored in the credential manager. In case of a security breach, the encrypted passwords remain secure even if stolen from the server, because all keys, including the MasterKey, are strong, randomly generated, and stored securely on the client side without any user involvement.
Furthermore, we employ a dual-position technique: to access and recover data, the user needs access to both Rostam's servers and one of the paired devices.
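The MasterKey idea, a random client-side key that encrypts credentials so a server-side breach exposes only ciphertext, can be sketched as below. The cipher is a toy keystream stand-in for the authenticated encryption a real credential manager would use, and all class and method names are hypothetical:

```python
import hashlib
import secrets

def keystream_encrypt(key: bytes, data: bytes) -> bytes:
    """Toy keystream cipher (stand-in for real authenticated encryption);
    XORing twice with the same key recovers the plaintext."""
    stream, block = b"", 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + block.to_bytes(4, "big")).digest()
        block += 1
    return bytes(a ^ b for a, b in zip(data, stream))

class CredentialVault:
    """Client-side vault: a random MasterKey — never derived from a password,
    never sent to the server — encrypts each credential before upload."""

    def __init__(self):
        self.master_key = secrets.token_bytes(32)  # random, kept client-side
        self.server_store = {}                     # what the server would see

    def save(self, site: str, password: str):
        self.server_store[site] = keystream_encrypt(
            self.master_key, password.encode())

    def load(self, site: str) -> str:
        return keystream_encrypt(
            self.master_key, self.server_store[site]).decode()

vault = CredentialVault()
vault.save("example.com", "s3cret-p@ss")
```

Because the MasterKey is generated randomly rather than derived from something the user must memorize, it sidesteps the weak-master-password problem discussed above; the trade-off is that recovery requires a paired device holding the key material.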
-
Two factor authentication security (Graduate School, 2024-02-02) Kumbasar, Sümeyra ; Özdemir, Enver ; 707201011 ; Cyber Security Engineering and Cryptography

The rapid advancement of technology and the widespread use of mobile applications in many aspects of our lives have started to attract the attention of cyber attackers. The scale of cyber attacks on these platforms is increasing over time, leading to reputational and financial losses for organizations. Organizations therefore take security measures to protect their sensitive and confidential information from unauthorized access and its adverse consequences. Cryptographic algorithms are used to encrypt sensitive data and transmit it securely to the receiving party. To ensure security, certain fundamental principles are followed: encryption methods protect the confidentiality and integrity of critical and sensitive data, and authentication and authorization controls govern access to it. Authorized users should be able to access these data whenever needed, and their actions should be logged. With the increasing presence of the Internet in our lives, the number of attacks on organizations is also rising. One of the most significant is the DDoS attack. DDoS attacks aim to disrupt systems by sending far more requests than they can handle; their objective is not to access, modify, or steal critical and sensitive data but to block and degrade the availability of systems. One of the fundamental elements of resisting cyber attacks and keeping systems secure is authentication. Two-factor authentication (2FA), an advanced authentication method designed to protect systems from unauthorized access, is commonly deployed by many organizations as an additional security layer.
However, despite enhancing access security, these systems have some vulnerabilities. In this master's thesis, we demonstrate how two-factor authentication, commonly used for security, can be exploited by attackers to cause denial-of-service conditions by halting the operation of services. The authentication phase has two steps: first, the authentication server verifies the username and password; second, it generates a PIN and sends it to the user. In our study, to model the PIN generation step, the server assigns a random number and encrypts it using 2048-bit numbers. A single such operation completes in milliseconds. However, as the number of requests per second increases, encryption with 2048-bit numbers becomes more demanding, and the server's response rate drops; freezing and slowdowns are also observed on the device running the program. As a result, the incoming requests eventually produce the effects of a DoS attack, degrading the server's performance. We anticipate that in mobile banking applications, simultaneous authentication requests from legitimate users would slow the application in a similar manner, resembling a denial-of-service attack, causing the application to serve fewer requests and reducing its functionality.
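The cost model behind this experiment can be approximated with a short script: each PIN issuance performs one 2048-bit modular exponentiation, so the server's time grows roughly linearly with the request rate. The modulus, exponent, and function names below are illustrative assumptions, not the thesis's actual parameters:

```python
import secrets
import time

# Hypothetical 2048-bit odd modulus and public exponent, standing in for
# the server's per-request PIN encryption parameters.
MODULUS = secrets.randbits(2048) | (1 << 2047) | 1
EXPONENT = 65537

def issue_pin() -> int:
    """Server-side step: pick a random PIN seed and 'encrypt' it with one
    2048-bit modular exponentiation — the costly operation per request."""
    pin_seed = secrets.randbits(128)
    return pow(pin_seed, EXPONENT, MODULUS)

def serve(requests: int) -> float:
    """Time how long the server spends answering `requests` PIN requests."""
    start = time.perf_counter()
    for _ in range(requests):
        issue_pin()
    return time.perf_counter() - start

# Cost scales with the request count, so a flood of authentication
# requests saturates the server — the DoS effect described above.
t_100 = serve(100)
t_1000 = serve(1000)
```

Scaling the request count up by another order of magnitude, as a flood of simultaneous 2FA logins would, makes the linear cost dominate the server's capacity.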