This thesis discusses the game Hex and two of its variants, Cylindrical Hex and Torus Hex. We start by giving the rules of the games and showing that no tie can occur, meaning that there will always be a winner. After that we discuss some existing strategies for Cylindrical Hex and program a Pure Monte-Carlo player for this game. From smart play observed in the Pure Monte-Carlo player, a new strategy for Cylindrical Hex is derived. To test this new strategy, experiments are carried out and discussed.
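To give an impression of the approach, here is a minimal sketch of a Pure Monte-Carlo move selector of the kind the thesis describes; the `legal_moves`, `winner`, and `place` helpers and the playout count are hypothetical stand-ins, not the thesis's actual implementation. Note how the no-tie result is used: a fully filled board always has a winner.

```python
import random

def random_playout(board, player, legal_moves, winner, place):
    """Fill the board with uniformly random moves and return the winner.

    Hex admits no ties, so a full board always has exactly one winner."""
    moves = list(legal_moves(board))
    random.shuffle(moves)
    for cell in moves:
        board = place(board, cell, player)
        player = -player  # players alternate
    return winner(board)

def pure_monte_carlo_move(board, player, legal_moves, winner, place,
                          n_playouts=200):
    """Choose the move whose random playouts win most often for `player`."""
    best_move, best_rate = None, -1.0
    for cell in legal_moves(board):
        child = place(board, cell, player)
        wins = sum(
            random_playout(child, -player, legal_moves, winner, place) == player
            for _ in range(n_playouts))
        rate = wins / n_playouts
        if rate > best_rate:
            best_move, best_rate = cell, rate
    return best_move
```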
Compactifications in mathematics date back to the late 19th and early 20th centuries. Maurice Fréchet and Felix Hausdorff both laid the foundational work on compact spaces, spaces in which every open cover has a finite subcover. The notion of compactification emerged as an extension of these compact spaces. The motivation behind compactifications was to find ways to extend a given space in a topological sense by adding limit points or 'points at infinity'. The process of compactification in topology entails changing a given topological space into a compact space, and there exist various methods to achieve this goal. The idea underlying compactifications is the embedding of the original topological space into a compact one. Within the scope of this thesis, our focus lies specifically on metric compactifications, which involve the embedding of metric spaces into compact spaces. Notably, in the case of the real numbers, this process entails the addition of the points +∞ and −∞. In the first chapter, we introduce some fundamental concepts and results that are necessary for this thesis. Then, we define the compactification of metric spaces in the second chapter, where we give the real numbers as an example. We also demonstrate how isometries can be extended to homeomorphisms on metric compactifications. In the final chapter, we consider the metric compactification of Euclidean d-dimensional space equipped with the p-norm. This compactification is particularly interesting because the 'points at infinity' can be computed explicitly. To determine these points, we utilize the fact that the space is metrizable.
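As a brief illustration of how the real line picks up exactly two points at infinity, consider the standard embedding of a metric space into a function space (the notation here is ours, not necessarily the thesis's): each point x_n of (ℝ, |·|) is sent to the function h_{x_n}(x) = |x − x_n| − |x_n|, normalized to vanish at 0, and points at infinity arise as limits of these functions.

```latex
% For fixed x and x_n -> +infty we have |x - x_n| = x_n - x and |x_n| = x_n,
% so h_{x_n}(x) -> -x; symmetrically, x_n -> -infty gives the limit x:
\[
  h_{x_n}(x) = |x - x_n| - |x_n|
  \;\xrightarrow[x_n \to +\infty]{}\; -x,
  \qquad
  h_{x_n}(x) \;\xrightarrow[x_n \to -\infty]{}\; x .
\]
% The two limit functions h(x) = -x and h(x) = x are the 'points at infinity'
% adjoined to the real line by the metric compactification.
```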
In survival analysis, a competing risk model is a statistical method used to analyze time-to-event data in situations where multiple events of interest may occur and compete for occurrence. The events are considered 'competing' because the occurrence of one event prevents the occurrence of the others. Traditional survival analysis focuses on a single event of interest, such as death due to a particular cause. However, in real-world scenarios, there can be multiple events that individuals in a study population might experience. These events can have different causes. For example, in a study involving cancer patients, the events of interest could be death from cancer, death from other causes, and disease recurrence. In survival analysis, a cure model is a statistical model used when a proportion of the study population is considered 'cured', meaning they will never experience the event of interest. This concept is particularly important when studying diseases with a good prognosis. A notable example is paediatric oncology, where patients may be considered cured if they experience long event-free survival. Despite the growing recognition of the significance of considering cured fractions in statistical analysis, there remains limited research on the theoretical aspects of combining competing risks and cure models. The integration of these two approaches has not been extensively studied until now.

This research aims to fill the existing gap in the field by focusing on the concept of identifiability. First, a general model that involves two competing events and cause-specific cure for both events is considered. The main objective is to identify the model parameters, particularly the dependence relationship between the two cure status indicators. A logistic model to estimate cure probabilities and a semi-parametric Cox model to assess cause-specific hazards (or subdistribution hazards) are employed. The results demonstrate that, under appropriate assumptions, certain parameters can be effectively identified. However, it is also shown that the model becomes unidentifiable without these specific assumptions. It is further shown that the models previously proposed in the literature can be seen as special cases of this general model. The thesis presents a novel estimation procedure for the general model, utilizing the EM (Expectation-Maximization) algorithm. The flexibility of this procedure allows it to be applied to special cases of the model. Two simulation studies were conducted to investigate the performance of the estimation procedure and to study the practical identifiability properties of the model for cure and competing risks. The results showed good performance for most parameters of the model. In conclusion, this thesis provides valuable insights into the practical identifiability of the parameters through both theoretical and simulation-based analyses. This research contributes to a better understanding of competing risks and cure models. The understanding of these statistical methods enables more accurate analysis of patient outcomes and treatment effects in diverse clinical and non-clinical contexts. Ultimately, this research positively impacts the field, facilitating better decision-making and improving overall outcomes for patients and individuals in various settings.
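To convey the EM idea concretely, here is a minimal sketch reduced far below the thesis's setting: a single event, a constant cure probability instead of the logistic model, and an exponential survival component instead of the Cox model. All names and parameter values are ours.

```python
import numpy as np

def em_mixture_cure(t, delta, n_iter=200):
    """EM for a toy mixture cure model: a cured fraction pi never fails,
    the uncured have exponential survival S_u(t) = exp(-lam * t).
    t: follow-up times; delta: 1 = event observed, 0 = censored."""
    t, delta = np.asarray(t, float), np.asarray(delta, int)
    pi, lam = 0.5, 1.0 / t.mean()           # crude starting values
    for _ in range(n_iter):
        # E-step: posterior probability of being *uncured* given the data;
        # an observed event proves the subject was uncured.
        su = np.exp(-lam * t)
        w = np.where(delta == 1, 1.0,
                     (1 - pi) * su / (pi + (1 - pi) * su))
        # M-step: update cure fraction and exponential rate with weights w.
        pi = 1.0 - w.mean()
        lam = delta.sum() / (w * t).sum()
    return pi, lam

# Tiny synthetic check: 40% cured, uncured fail at rate 2, censoring at t = 3.
rng = np.random.default_rng(0)
n = 5000
cured = rng.random(n) < 0.4
latent = np.where(cured, np.inf, rng.exponential(1 / 2.0, n))
t_obs = np.minimum(latent, 3.0)
delta = (latent <= 3.0).astype(int)
print(em_mixture_cure(t_obs, delta))  # should land near (0.4, 2.0)
```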
This research was performed to determine which type of model gives a better prediction of a drug's transport into and inside a tumor cell. Two types of models have been developed with compartment modeling, using the partition coefficient and concentration gradient of a compound. We compared the simulations of these models while varying the drug's partition coefficient. The model with intermediate steps for crossing the membrane takes more time than the model without these steps to reach the equilibrium partition of the drug over the several compartments. This suggests that the model including these steps gives a better prediction of real-life drug transport. Beyond a certain value of the partition coefficient, increasing it further does not make the drug enter the cell faster. These results suggest that the model with the intermediate steps is most effective for modeling data for different compounds, to test whether they would be suitable drugs to treat cancer.
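A minimal sketch of the simpler kind of compartment model (no intermediate membrane steps) may help fix ideas; the equal compartment volumes, the single rate constant k, and the numbers are assumptions of ours, not the thesis's models. The flux is driven by the concentration gradient relative to the partition coefficient K, so the system equilibrates at c_in = K · c_out.

```python
import numpy as np
from scipy.integrate import solve_ivp

def two_compartment(t, c, k, K):
    """Toy two-compartment model with equal volumes: net flux into the cell
    vanishes exactly when c_in = K * c_out (the partition equilibrium)."""
    c_out, c_in = c
    flux = k * (K * c_out - c_in)   # net flow from outside into the cell
    return [-flux, flux]

k, K = 1.0, 5.0                      # rate constant and partition coefficient
sol = solve_ivp(two_compartment, (0, 10), [1.0, 0.0], args=(k, K))
c_out, c_in = sol.y[:, -1]
print(c_in / c_out)                  # tends to K as the system equilibrates
```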
This thesis discusses a lesser-known formulation of quantum mechanics called deformation quantisation. This theory provides a more intuitive way to study quantum systems, in which confusing topics such as the relation between the operator commutator and the classical Poisson bracket and the concept of the classical limit become very obvious. After a short introduction, Chapter 2 introduces the mathematical foundation of Hamiltonian classical mechanics, symplectic geometry, and states the Darboux Theorem. Chapter 3 develops the theory of Hochschild cohomology and gives a classification of the cohomology spaces of the algebra of smooth functions on a manifold. In Chapter 4, deformations of algebras are defined and results from Hochschild cohomology are used to prove lemmas about star products, which are smooth deformations of the algebra of smooth functions on a manifold. Chapter 5 introduces the special case of the Moyal star product and shows how it can be used to obtain deformation quantisation. In this chapter, it is also shown that deformation quantisation is completely equivalent to the Hilbert space formalism that is traditionally taught in undergraduate studies, and the simple harmonic oscillator is treated as an example. Finally, Chapter 6 summarises the possible benefits and drawbacks of teaching deformation quantisation instead of the Hilbert space formalism and lists some avenues of further study.
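For orientation, the Moyal star product on ℝ² with coordinates (x, p) can be written in the following standard form (quoted from the general literature rather than from the thesis), which makes the claimed relation between commutators and Poisson brackets explicit:

```latex
\[
  f \star g
  = f \, \exp\!\Big( \tfrac{i\hbar}{2}
      \big( \overleftarrow{\partial_x}\,\overrightarrow{\partial_p}
          - \overleftarrow{\partial_p}\,\overrightarrow{\partial_x} \big) \Big) g
  = fg + \tfrac{i\hbar}{2}\,\{f,g\} + O(\hbar^2),
\]
% so the star commutator reproduces the Poisson bracket to first order,
%   f \star g - g \star f = i\hbar\,\{f,g\} + O(\hbar^2),
% and the classical limit \hbar -> 0 recovers the commutative product fg.
```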
In tonal languages such as Mandarin Chinese, the meaning of a word depends on the pitch variation of the tone. Since tones are often not pronounced in isolation, but rather concatenated, neighboring tones affect each other. This gives rise to tonal coarticulation. In this thesis, we explore whether, given two concatenated tones of the Mandarin word "ma", it is possible to predict the following tone on the basis of the coarticulation effect present in the first tone, and vice versa. The phonetic data used for this exploration have a certain intrinsic smoothness that points naturally towards functional data analysis as a tool to study them. Therefore, we will be using multiple functional data analysis techniques. We start with k-means clustering on the raw data with the Euclidean distance and the Manhattan distance. Afterwards, we study the effect on tone duration, for which we use duration analysis. Furthermore, previous research indicates that the coarticulation effect lies at the level of covariances. Hence, we will also cluster functional covariances. In the last section, the implications of the results are discussed and suggestions are made for further research. Lastly, plots obtained from the analyses are shown in the appendix.
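A minimal sketch of the clustering step on discretised pitch contours, under assumptions of our own (plain Lloyd iteration on sampled curves, toy rising/falling contours instead of the thesis's phonetic data): switching the metric from Euclidean to Manhattan also switches the centroid update from mean to median.

```python
import numpy as np
from scipy.spatial.distance import cdist

def kmeans_curves(X, k, metric="euclidean", n_iter=100, seed=0):
    """Lloyd-style clustering of sampled contours (rows of X).
    metric="euclidean" gives ordinary k-means (mean centroids);
    metric="cityblock" gives the Manhattan variant (median centroids)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(n_iter):
        labels = cdist(X, centers, metric=metric).argmin(axis=1)
        agg = np.mean if metric == "euclidean" else np.median
        new = np.array([agg(X[labels == j], axis=0) for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

# Toy stand-in for pitch contours: noisy rising vs. falling curves.
t = np.linspace(0, 1, 50)
X = np.vstack([t + 0.1 * np.random.randn(30, 50),
               1 - t + 0.1 * np.random.randn(30, 50)])
labels, _ = kmeans_curves(X, k=2, metric="cityblock")
print(np.bincount(labels))  # two clusters of roughly 30 curves each
```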
We study the mathematics and physics involved in the generation of gravitational waves by stellar-mass binary black holes and their subsequent detection by LISA, a space-based interferometer detector. We show that LISA will be capable of detecting nearby binary black holes with a maximal relative distance error of 0.2 and a sky-location error of 1 square degree if the total mass of the binary is at least eighty solar masses.
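For orientation, the characteristic strain amplitude of a circular binary at luminosity distance d_L, in terms of its chirp mass and gravitational-wave frequency f, is the standard quadrupole-order expression (quoted from the general literature, not from the thesis):

```latex
\[
  h \simeq \frac{4}{d_L}
    \left( \frac{G \mathcal{M}}{c^2} \right)^{5/3}
    \left( \frac{\pi f}{c} \right)^{2/3},
  \qquad
  \mathcal{M} = \frac{(m_1 m_2)^{3/5}}{(m_1 + m_2)^{1/5}},
\]
% which indicates why heavier binaries (larger chirp mass) are louder in
% LISA's band and can therefore be localized more accurately.
```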
Directed topology is a fairly new field of mathematics with applications in concurrency. It extends the concept of a topological space by adding a notion of directedness, in which directed paths play a very important role. There are direction-preserving maps between directed spaces, called directed maps. A special case of these is a directed path homotopy, which transforms one directed path into another. Using these deformations, directed paths are partitioned into equivalence classes, and a special category, the fundamental category, can be associated with a directed space. In this thesis we explain these definitions and present a special theorem: a directed version of the Van Kampen Theorem. This theorem allows the calculation of fundamental categories by combining local knowledge about paths. Our main contribution is the formalization of this material using the Lean proof assistant, and we show how we have implemented it.
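As an impression of what such a formalization can look like, here is a minimal Lean 4 sketch (with Mathlib) of one common definition of a directed space: a topological space with a distinguished set of directed paths. The names and the pared-down axioms are our own simplification, not the thesis's actual code.

```lean
import Mathlib

/-- A directed space: a topological space `X` together with a predicate
singling out the *directed* paths, closed under constant paths and
concatenation (a simplified version of the usual d-space axioms). -/
structure DirectedSpace (X : Type*) [TopologicalSpace X] where
  IsDipath : ∀ {x y : X}, Path x y → Prop
  refl_mem : ∀ x : X, IsDipath (Path.refl x)
  trans_mem : ∀ {x y z : X} {p : Path x y} {q : Path y z},
    IsDipath p → IsDipath q → IsDipath (p.trans q)
```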
Quantum computing has the potential to revolutionise the field of cryptography. Quantum money is a cryptographic scheme that attempts to create unforgeable currency. This thesis investigates the knot-based quantum money scheme proposed by Farhi et al. [FGH+12], which assumes that finding transformations between equivalent knots is computationally demanding. We start by providing a comprehensive understanding of the relevant concepts of knot theory, particularly the Alexander polynomial. Next, we discuss the proposed quantum money scheme. Finally, we discuss implementation challenges on a quantum simulator.
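As a concrete touchstone, the Alexander polynomial can be computed via the Conway skein relation; the trefoil case below consists of standard facts from knot theory, not results specific to the thesis:

```latex
\[
  \Delta_{L_+}(t) - \Delta_{L_-}(t)
    = \bigl(t^{1/2} - t^{-1/2}\bigr)\,\Delta_{L_0}(t),
  \qquad \Delta_{\text{unknot}}(t) = 1 .
\]
% Resolving one crossing of the trefoil gives L_+ = trefoil, L_- = unknot,
% and L_0 = Hopf link with \Delta(t) = t^{1/2} - t^{-1/2}, hence
\[
  \Delta_{\text{trefoil}}(t)
    = 1 + \bigl(t^{1/2} - t^{-1/2}\bigr)^2
    = t - 1 + t^{-1} .
\]
% Equivalent knots share the same Alexander polynomial, which is why the
% money scheme can use it as an efficiently checkable invariant.
```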