In this research, we study the oxidation process of the Transition Metal Dichalcogenide (TMD) niobium diselenide, NbSe2. TMDs are a category of van der Waals materials, samples of which are obtained by exfoliation. The measurements are performed with the Low Energy Electron Microscope (LEEM), which measures reflectivity at different energies, resulting in so-called IV curves. To analyze oxidation, we have developed a new method to obtain in situ exfoliated flakes in the LEEM set-up. First, we measure in situ cleaved NbSe2 flakes (bulk) and follow the reflectivity upon adding (pure) oxygen. Additionally, the reflectivity of ex situ cleaved NbSe2 flakes is assessed. Here, an intensity boundary between a flake's edge and center is recognized. The reflectivity measurements show that the electronic structure differs across the boundary: a V-shaped reflectivity minimum appears, which sharpens over time upon oxidation. The boundary is further investigated with Atomic Force Microscopy and Energy Dispersive X-ray analysis. Additionally, we perform roughness analysis and Principal Component Analysis; the latter provides an alternative method to follow the change in electronic properties over time. We propose that already degraded NbSe2 flakes are more susceptible to oxidation damage than in situ exfoliated flakes. Upon further measurements, the in situ cleaved samples do not show any signs of degradation; we therefore attribute the intensity contrast, with its associated IV features, to oxidized NbSe2.
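The Principal Component Analysis step mentioned above can be illustrated with a minimal sketch: a stack of IV curves (one reflectivity-vs-energy curve per time frame) is decomposed with scikit-learn's PCA, and the score of the leading component per frame tracks the spectral change over time. The data below are synthetic stand-ins (a deepening V-shaped dip plus noise), not the thesis measurements.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Synthetic stand-in for a time series of IV curves: each row is one
# reflectivity-vs-energy curve; a V-shaped minimum deepens over time.
energies = np.linspace(0.0, 10.0, 200)
times = np.arange(50)
curves = np.array([
    1.0 - (t / 50.0) * np.exp(-np.abs(energies - 4.0))  # deepening dip
    + 0.01 * rng.standard_normal(energies.size)          # measurement noise
    for t in times
])

# PCA over the curve stack: the leading component captures the dominant
# spectral change, and its score per frame follows that change in time.
pca = PCA(n_components=2)
scores = pca.fit_transform(curves)  # shape (n_times, 2)
print(pca.explained_variance_ratio_[0])
```

Plotting `scores[:, 0]` against frame index would then give the degradation trend without choosing a specific IV feature by hand.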
One of the greatest remaining puzzles in physics is what particle dark matter consists of. For this project, the theory of dark pions is considered, a Hidden Valley model that extends the Standard Model with new, dark particles and a new force, dark QCD. A sensitivity study is performed to determine how many dark pions are expected to be in acceptance of the LHCb detector for Run 2 conditions; LHCb is well-suited to search for particles in the considered O(1) GeV mass and O(1) - O(100) ps lifetime range. Additionally, a framework has been developed to study the dependence of the sensitivity on a number of theoretical parameters of the dark QCD model, namely the probability to form a dark vector meson instead of a dark pion, the number of colours in dark QCD, the dark QCD scale, and the Higgs mass. It is found that O(100) dark pions are in LHCb acceptance for different track categories, and that the considered theoretical parameters do not drastically change the number of expected particles (with some small caveats), staying within a difference of about 20%. This is acceptable given the expected experimental uncertainty, showing that theory-independent searches for dark pions are possible.
The past decades have shown a rise in skin cancer. This creates the need for prevention and efficient treatment. The most common skin cancer (melanoma) can only be treated when detected early. In this thesis we propose a method of increasing awareness for people with a high risk of skin cancer as well as allowing for early detection. Skin cancer is hard to detect, even for experienced healthcare professionals. One of the signals of potentially harmful lesions is change over time. We propose to develop an application with which changes in skin lesions can be identified early. By allowing patients to film their body with a mobile phone camera, we aim to track the development of lesions. If a patient films their body regularly, changes can be detected and the application can urge the patient to consult a dermatologist. In this thesis we explore the possibility of combining the frames of these films into an overview displaying the patient's complete back or arm. Combining frames is called stitching. Different stitching techniques found in the literature are explored and tested for effectiveness. The optimizations performed are reported and the final result is presented. The location of the different lesions on an overview of the body is needed to show the patient and the healthcare professional where potentially harmful lesions are located on the body. This allows for further inspection at the dermatology department.
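The core of stitching is estimating the offset between two overlapping frames and compositing them. The thesis explores several techniques; the sketch below shows only the simplest translational case, with synthetic arrays standing in for video frames and a brute-force normalized cross-correlation search (both function names and the averaging blend are illustrative, not the thesis pipeline).

```python
import numpy as np

def find_vertical_offset(a: np.ndarray, b: np.ndarray, max_shift: int) -> int:
    """Return the number of overlapping rows between the bottom of frame `a`
    and the top of frame `b`, by maximizing normalized cross-correlation."""
    best_shift, best_score = 0, -np.inf
    for s in range(1, max_shift + 1):
        oa = a[-s:].ravel().astype(float)
        ob = b[:s].ravel().astype(float)
        oa, ob = oa - oa.mean(), ob - ob.mean()
        denom = np.linalg.norm(oa) * np.linalg.norm(ob)
        score = (oa @ ob) / denom if denom > 0 else 0.0
        if score > best_score:
            best_score, best_shift = score, s
    return best_shift

def stitch(a: np.ndarray, b: np.ndarray, overlap: int) -> np.ndarray:
    # Average the overlapping rows, keep the remainder of both frames.
    blended = (a[-overlap:] + b[:overlap]) / 2.0
    return np.vstack([a[:-overlap], blended, b[overlap:]])

# Demo: two frames cut from one synthetic "scene" with a 20-row overlap.
rng = np.random.default_rng(1)
scene = rng.standard_normal((100, 32))
frame_a, frame_b = scene[:60], scene[40:]
offset = find_vertical_offset(frame_a, frame_b, max_shift=40)
panorama = stitch(frame_a, frame_b, offset)
```

Real frames additionally need feature-based registration (rotation, perspective) rather than a pure vertical shift, which is what dedicated stitching libraries provide.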
In this thesis the relative spectral energy density of stochastic primordial gravitational waves is investigated. Decoupling of Standard Model particles and neutrino free-streaming affect the expansion history of the universe and thus leave characteristic signatures on the amplitude of the gravitational wave spectrum. Adding extra light or heavy particles damps the spectrum at frequencies corresponding to times before the particle decouples. Including an extra neutrino species amplifies the spectrum at larger wave numbers, but damps it at smaller wave numbers. Measuring these primordial gravitational waves would thus reveal the thermal history of the universe. One possible non-standard thermal history is early matter domination due to the inflaton. It is shown that, in this cosmology, the end of early matter domination and the beginning of the radiation era depend linearly on the reheating temperature.
The intergalactic medium (IGM) contains most of the baryonic matter of the Universe and serves as a suitable environment for probing the thermal history of the Universe. The crucial moment in IGM evolution is the Epoch of Reionization, corresponding to the transition from a neutral to an ionized IGM. However, due to observational limitations, this period is still not well understood. In this thesis, we focus on constraining the IGM thermal history using Lyman-α forest data. This method is applicable in a wide range of temperatures, densities, and ionization fractions of cosmic gas at z ≈ 2-5. Observations show that the longitudinal flux power spectrum of the Lyman-α forest exhibits a cut-off at small scales. This phenomenon is caused by thermal Doppler broadening, peculiar velocities along the line of sight (LOS), hydrogen pressure smoothing, and warm dark matter. The first two effects act only along the LOS, while the last two affect all spatial directions. To separate the one-dimensional and three-dimensional effects, we used the method of close quasar pairs, which is based on studying the correlations between the Lyman-α forests of close quasar pairs. We used the Kolmogorov-Smirnov test to analyze the differences between distributions of the phase difference, which characterizes correlations between Lyman-α forests. The calculations were performed for various thermal histories, parameters characterizing the IGM, LOS separations, and wavenumbers, and accounting for different effects (Doppler broadening and peculiar velocities). Our results indicate that this method can distinguish various thermal histories regardless of the IGM thermal state and one-dimensional effects. Moreover, at separations of the order of the pressure-smoothing scale, there is a prominent feature caused by the different influence of pressure smoothing at large and small scales. In addition, this simple and powerful approach has the potential to distinguish scenarios with warm dark matter.
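The statistical machinery here, comparing two phase-difference distributions with a two-sample Kolmogorov-Smirnov test, can be sketched with scipy. The von Mises samples below are illustrative stand-ins for phase differences under two thermal histories (a narrower spread for tightly correlated forests, a broader one for stronger smoothing); they are not the thesis simulations.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Stand-ins for phase-difference samples under two thermal histories:
# tight correlations (phase differences concentrated near 0) vs. a
# broader spread from stronger smoothing. Parameters are made up.
phases_cold = rng.vonmises(mu=0.0, kappa=4.0, size=2000)
phases_hot = rng.vonmises(mu=0.0, kappa=1.0, size=2000)

# Two-sample KS test: can the two distributions be told apart?
stat, p_value = stats.ks_2samp(phases_cold, phases_hot)
print(stat, p_value)
```

A small p-value indicates the two thermal histories produce distinguishable phase-difference statistics at that LOS separation and wavenumber.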
In this thesis we present an experimental realisation of a double-loop-type magnetic Paul trap. We show that a microgram-heavy NdFeB permanent magnet can be stably levitated for hours at room temperature in this trap. Magnetic levitation of a magnetized particle is theoretically possible in this trap by generating opposed alternating magnetic fields. We show the fabrication of a printed circuit structure capable of producing these fields, as well as the engineering behind the realisation of the trap. By both optical and magnetic readout we characterize the motion of the trapped magnet and show that its center-of-mass motion frequencies satisfy $\omega_z = 2\omega_{x,y} \approx 20\,\mathrm{Hz}$. We characterize the damping of these modes and find that at low pressure the quality factor is strongly limited (to $Q \approx 90$) by coupling to the environment through the generation of eddy currents.
In this work, a near-zero-stiffness mechanical filter is designed for use in STMs in a cryogenic environment. The filter is a geometric anti-spring (GAS) filter, which consists of a set of blades with a payload attached. This design allows for a low resonance frequency of 0.27 Hz and thus a low cutoff frequency in terms of filtering. First, a theoretical model is described in order to determine the relevant properties of the filter and its approximate workings. Second, the model was experimentally verified, from which the resonance and damping of the filter were found. A limited number of vibration measurements were also done to check whether the filter functions as expected; however, due to instrumental limitations these were not conclusive.
This project employs reinforcement learning (RL) techniques to explore novel decoding strategies for quantum error correction, focusing in particular on the toric code, to address the challenge of maintaining stable quantum states for fault-tolerant quantum computing. Two game frameworks are established, including a novel dynamic game framework applicable to the training and evaluation of RL agents, with potential application in multi-agent scenarios. The RL agents use Stable Baselines 3's Proximal Policy Optimization and are shown to achieve Minimum Weight Perfect Matching performance on 3 × 3 toric code lattices in both the static and dynamic game frameworks.
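The "game" framing can be made concrete with a minimal environment skeleton: the agent observes the vertex syndrome of an L × L toric code with bit-flip errors and acts by flipping single edge qubits until the syndrome clears. Everything below (class name, reward shaping, the random placeholder agent) is an illustrative sketch, not the thesis framework, and a real setup would wrap this in a Gymnasium interface for Stable Baselines 3.

```python
import numpy as np

class ToricSyndromeEnv:
    """Minimal decoding game on an L x L toric code (bit-flip errors only).
    Qubits live on horizontal (h) and vertical (v) edges of a periodic
    lattice; the state is the vertex syndrome. Illustrative sketch only."""

    def __init__(self, L=3, p_error=0.1, seed=0):
        self.L, self.p = L, p_error
        self.rng = np.random.default_rng(seed)

    def _syndrome(self):
        L, h, v = self.L, self.h, self.v
        s = np.zeros((L, L), dtype=int)
        for i in range(L):
            for j in range(L):
                # vertex (i, j) touches two horizontal and two vertical edges
                s[i, j] = h[i, j] ^ h[i, (j - 1) % L] ^ v[i, j] ^ v[(i - 1) % L, j]
        return s

    def reset(self):
        self.h = (self.rng.random((self.L, self.L)) < self.p).astype(int)
        self.v = (self.rng.random((self.L, self.L)) < self.p).astype(int)
        return self._syndrome()

    def step(self, action):
        # Actions 0..L*L-1 flip horizontal edges, L*L..2*L*L-1 vertical ones.
        kind, idx = divmod(action, self.L * self.L)
        i, j = divmod(idx, self.L)
        if kind == 0:
            self.h[i, j] ^= 1
        else:
            self.v[i, j] ^= 1
        syndrome = self._syndrome()
        done = not syndrome.any()
        reward = 1.0 if done else -0.01  # encourage short correction chains
        return syndrome, reward, done

# Random agent as a placeholder for a trained PPO policy.
env = ToricSyndromeEnv(L=3, p_error=0.15, seed=7)
syndrome = env.reset()
for _ in range(50):
    if not syndrome.any():
        break
    syndrome, reward, done = env.step(int(env.rng.integers(2 * 3 * 3)))
```

Note that clearing the syndrome is necessary but not sufficient for successful decoding: a full evaluation must also check that the applied correction does not implement a logical operator.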
This research covers the development and application of multiple low-noise, high-bandwidth lockboxes to position the mirrors of four cascaded optical cavities using piezo actuators for frequency locking with a precision of 40 Hz. The design and fabrication of a custom printed circuit board that hosts two ARM-based microcontrollers for signal processing are elucidated. By measuring the impedance of the piezo actuators over a large range of frequencies, we found several electromechanical resonances ranging from a few kHz to a main resonance at 80-100 kHz. It was found that these resonances greatly impact the ability to lock the cavity to the laser source and thereby impose a bandwidth limitation on the feedback. By avoiding excitation of such resonances, reducing the feedback bandwidth to below the first prominent resonance at 2.5 kHz, we were able to achieve a high-quality lock of a single optical cavity. Using a reduced bandwidth of 250 Hz and a reduced modulation frequency of 1.9 kHz, we demonstrated the locking of four cascaded cavities and achieved an optical transmission of T ≈ 40%, limited mostly by optical alignment. Finally, we show an initial lock-freeze procedure for three cascaded cavities in which 90% of the transmission during the locked state is retained for a period of 4.9 seconds while providing no feedback on the piezos.
Understanding the early formation of attitudes towards emerging technologies, such as quantum science & technology (QS&T), is essential for aligning the effect of science communication in practice with its intentions. This study explores the application of natural language processing (NLP) techniques to investigate the influence of news articles on social media comments regarding QS&T. We curated a dataset of 217 articles and 14,391 top-level comments from Reddit. Employing GPT-4, a semi-automated annotation method was developed to label comments for sentiment and engagement towards QS&T, achieving a Cohen's kappa of 0.82 when pooling over multiple labelling repetitions and comparing with human annotations. We then used support-vector regression to determine whether news article embeddings could be used to predict comment sentiment and engagement. After experimenting with various embedding strategies, including Sentence-BERT and RoBERTa models, no significant correlations were found between article content and comment sentiment or engagement towards QS&T. Further interdisciplinary work in empirical communication research and NLP is suggested to explore alternative representations for news articles and their ability to predict perceptions in news comments.
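The annotation-agreement metric used above, Cohen's kappa, measures agreement between two labellers corrected for chance agreement. A minimal sketch with scikit-learn, on made-up toy labels (not the thesis data):

```python
from sklearn.metrics import cohen_kappa_score

# Toy stand-in: sentiment labels from a human annotator and from pooled
# model repetitions for the same ten comments (values are illustrative).
human = ["pos", "pos", "neg", "neu", "pos", "neg", "neu", "pos", "neg", "pos"]
model = ["pos", "pos", "neg", "neu", "pos", "neg", "pos", "pos", "neg", "pos"]

kappa = cohen_kappa_score(human, model)
print(round(kappa, 2))
```

Here raw agreement is 9/10, but kappa is lower because some of that agreement would occur by chance given the label frequencies; values above roughly 0.8 are conventionally read as strong agreement.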
Cancer metastasis remains a critical area of study within the field of cancer research. The tumor microenvironment (TME), comprising various cell types and the extracellular matrix (ECM), plays a pivotal role in controlling tumor initiation and progression. Here we show an investigation into the mechanical phenotype of Hs 578T breast cancer cells within the TME, focusing particularly on the role of cell-ECM interactions in modulating cellular traction forces. Hs 578T cells with a stable integrin-α2 (ITGA2) knockout were utilized, and the resulting pressures were compared between the control and the knockout at different collagen concentrations. Attention is hence given to ITGA2 and its role in mediating cell-ECM interactions. Through the utilization of elastic hydrogel microparticles as localized stress sensors and advanced microscopy techniques, we show that increasing the collagen concentration results in increased traction forces exerted by control breast cancer cells. Conversely, the traction forces exerted by ITGA2 Hs 578T knockout cells remain unaffected by changes in collagen concentration. In addition, a linear relationship between the traction and its standard deviation is observed, regardless of the Hs 578T cell type and collagen concentration. The findings contribute to a deeper understanding of cancer biomechanics, offering insights into potential therapeutic targets for inhibiting metastatic spread in breast cancer.
In this research, a quantum and a classical version of a value-based reinforcement learning (RL) method are compared. This is done by training each of them to learn to play a simple game called Fox in a Hole. The two models are compared based on performance, training stability, convergence speed, and number of trainable parameters. After hyperparameter tuning and further experimentation with the models, no clear difference is found between their performance and training stability. Nonetheless, the quantum model does seem to converge more slowly as the dimensionality of the game grows, and it also seems to require longer computation times than the classical model to keep up with its performance. Thus, the results suggest that for the task at hand, a classical value-based RL method is preferred over its quantum counterpart.
This report covers a theoretical and experimental investigation of the magnetic fields caused by eddy currents in spherical conducting objects. These eddy currents are induced by a time-varying magnetic field. First, the magnetic field is determined analytically from Maxwell's equations. Then an experiment is done to measure this magnetic field. By comparing the theoretical and experimental results, one can determine material properties such as the conductivity σ and the relative permeability μ_r. For a solid aluminum sphere the experiment gave σ = 4.75 · 10^7 S/m while the theoretical value is 3.767 · 10^7 S/m, a difference of 26%. For the spherical steel shell, we found σ = 5 · 10^6 S/m and μ_r = 200. This is approximately the same as the estimated theoretical values for hull steel, which has σ = 5 · 10^6 S/m and μ_r = 250.
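The quoted 26% discrepancy for aluminum is the relative difference of the measured conductivity with respect to the theoretical value, which can be checked directly:

```python
sigma_exp = 4.75e7     # measured conductivity of the aluminum sphere, S/m
sigma_theory = 3.767e7  # theoretical (literature) value, S/m

# Relative difference with respect to the theoretical value.
rel_diff = abs(sigma_exp - sigma_theory) / sigma_theory
print(f"{rel_diff:.0%}")  # prints "26%"
```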
In this research the seasonal variation of the atmospheric muon rate at the KM3NeT/ARCA detector was studied in order to determine the temperature correlation coefficient αT. KM3NeT is a cubic-kilometre neutrino telescope consisting of two large-volume water-Cherenkov detectors, ARCA and ORCA, located in the Mediterranean Sea. The Cherenkov radiation emitted by high-energy muons traveling through the seawater is detected by an assembly of 31 photomultiplier tubes (PMTs) situated inside a sphere-shaped Digital Optical Module (DOM). Eighteen such DOMs are connected to form a Detection Unit (DU). For the ARCA detector, these DUs are located on the seafloor at around 3.5 km depth, extending vertically to about 2.7 km below sea level. Taking advantage of correlations between hits registered at different PMTs within a DOM, the atmospheric muon rate at each DOM can be measured. Furthermore, the difference in height of the DOMs in each DU enables the utilization of the depth dependence of atmospheric muons to precisely determine the muon rate at the ARCA detector. This makes it possible to detect rate variations of a few percent. Additionally, the effective temperature is determined through a weighted integral of the available temperature data above the geographic location of the ARCA detector. Comparing the atmospheric muon rate and the effective temperature during the data-taking period from 26.09.2021 until 1.06.2022, a temperature correlation coefficient of αT = 1.166 ± 0.128 was established. This is slightly above the theoretically predicted value of 0.86. To verify the robustness of the proposed method of determining the rate-temperature correlation, cross-checks were done with Monte Carlo files, background signals, and the examination of the depth relationship with time; all returned the expected results. However, when the same method was employed on a smaller data set covering the data-taking period between 12.05.2021 and 2.09.2021, no significant correlation between the atmospheric muon rate and the effective temperature could be established. Furthermore, the cross-checks performed on this data set did not confirm expectations. This is most likely due to the small data set, which cannot accurately capture the long-term seasonal effect. These results should nevertheless not be neglected. Therefore, while the employed method did return promising results for the larger data set, more investigation into the efficiency determination and the errors on the fitted slope is needed to confidently verify the reliability of the final result. It is further suggested to revisit this study once a consistent data set of at least one year is available.
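The coefficient αT is conventionally defined as the slope of the fractional muon-rate variation against the fractional effective-temperature variation, ΔR/⟨R⟩ = αT · ΔT_eff/⟨T_eff⟩, so the fit itself reduces to a linear regression. The sketch below uses synthetic daily series with an assumed αT of 1.1 (illustrative values, not KM3NeT data):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic daily series: a seasonal fractional effective-temperature
# variation, and a muon rate responding with alpha_T = 1.1 plus noise.
alpha_true = 1.1
dT_over_T = 0.01 * np.sin(2 * np.pi * np.arange(250) / 365.0)
dR_over_R = alpha_true * dT_over_T + 0.002 * rng.standard_normal(250)

# alpha_T is the slope of dR/R vs dT/T.
alpha_fit, intercept = np.polyfit(dT_over_T, dR_over_R, 1)
print(alpha_fit)
```

This also illustrates why the short data set fails: with only a fraction of the seasonal cycle observed, the spread of dT/T shrinks and the uncertainty on the fitted slope grows accordingly.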
Neural networks are susceptible to minor distortions in their input, which can lead to errors they would not otherwise make. This susceptibility, termed the network's robustness, is a crucial aspect to evaluate. While several methods exist for measuring robustness, they usually suffer from interpretability issues and do not provide a statistical guarantee. In this work, we propose a novel robustness measure that addresses these shortcomings by modeling the robustness as a probability distribution and measuring its 0.05 quantile. Additionally, previous work suggests the potential modeling of robustness through a log-normal distribution. To evaluate this hypothesis and its computational benefits, we introduce an estimator that assumes the distribution is log-normal. A comparison with the standard parameter-free estimator reveals significantly improved computational efficiency with the parametrized approach. However, the log-normal assumption requires further research: it is too strong and needs to be relaxed before the parametrized estimator can be reliably utilized.
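The two estimators contrasted above can be sketched side by side: the parameter-free one reads off the empirical 0.05 quantile of robustness samples, while the parametrized one fits a log-normal and evaluates its quantile in closed form. The samples below are synthetic stand-ins drawn from a log-normal, so both estimators should roughly agree; nothing here reproduces the thesis experiments.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Stand-in robustness samples, assumed log-normally distributed
# (the hypothesis under evaluation); parameters are made up.
samples = rng.lognormal(mean=-1.0, sigma=0.5, size=200)

# Parameter-free estimator: the empirical 0.05 quantile.
q_empirical = np.quantile(samples, 0.05)

# Parametrized estimator: fit mu and sigma on the log-samples, then
# read off the 0.05 quantile of the fitted log-normal in closed form.
log_s = np.log(samples)
mu_hat, sigma_hat = log_s.mean(), log_s.std(ddof=1)
q_lognormal = np.exp(mu_hat + sigma_hat * stats.norm.ppf(0.05))

print(q_empirical, q_lognormal)
```

The parametrized estimator's efficiency gain comes from using every sample to pin down two parameters, rather than relying on the sparse lower tail alone; this is exactly why it is fragile when the log-normal assumption fails.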
Learning curves are important for decision making in supervised machine learning. They show how the performance of a machine learning model develops with a given resource. In this work, we consider learning curves that model the performance of a machine learning model as a function of the number of data points used for training. For decision making, it is often useful to extrapolate learning curves, which can be done, for example, by fitting a parametric model to the observed values, or by using an extrapolation model trained on learning curves from similar datasets. We perform an analysis comparing these two techniques with different observations and prediction objectives. When only a small number of initial segments of the learning curve have been observed, we find that it is better to rely on learning curves from similar datasets. Once more observations have been made, a parametric model, or just the last observation, should be used. Moreover, we find that using a parametric model is mostly useful when the exact value of the learning curve itself is of interest. Lastly, we use this knowledge to improve machine learning on a particle physics dataset.
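Parametric extrapolation of a learning curve can be sketched with a common choice of functional form, a shifted power law error(n) = a + b·n^(-c), fitted on observed anchor points and evaluated at a larger training-set size. The anchor values below are synthetic (generated from the same power law plus noise); the functional form is one standard option among several studied in the learning-curve literature.

```python
import numpy as np
from scipy.optimize import curve_fit

def power_law(n, a, b, c):
    # Shifted power-law learning-curve model: error = a + b * n^(-c)
    return a + b * np.power(n, -c)

rng = np.random.default_rng(5)

# Synthetic anchor points: error rates at increasing training-set sizes.
n_train = np.array([50, 100, 200, 400, 800, 1600], dtype=float)
errors = power_law(n_train, 0.10, 2.0, 0.5) + 0.002 * rng.standard_normal(6)

# Fit on the observed segment, then extrapolate to a larger budget.
params, _ = curve_fit(power_law, n_train, errors, p0=(0.1, 1.0, 0.5))
predicted = power_law(6400.0, *params)
print(params, predicted)
```

With only the first two or three anchors the fit is badly constrained, which matches the finding above that curves from similar datasets are more reliable early on.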
In the pursuit of designing complex materials with desired properties, understanding their design parameter space is crucial. However, the convoluted nature of this space often hinders comprehension of complex materials' responses as a function of their design parameters. Machine Learning has recently emerged as a promising tool for capturing patterns in complex design spaces, although this performance often comes at the cost of interpretability. This thesis aims to explore the design parameter space of interacting hysterons using interpretable Machine Learning, specifically Decision Tree inspired methods. Despite the complexity of the design parameter space of even small systems of interacting hysterons, interpretable Machine Learning can classify coarse-grained properties of the system effectively. Introducing the Support Vector Classifier (SVC) inspired Decision Tree, we achieve almost perfect isolation of these properties. This model preserves interpretability while effectively probing the statistical structure of the design parameter space of systems of interacting hysterons.
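The interpretability argument can be illustrated with the plain Decision Tree baseline: trained to classify a coarse-grained property over a design space, the fitted model is directly readable as a set of threshold rules. The design space, labeling rule, and feature names below are toy stand-ins (a single pair of switching thresholds with an axis-aligned rule), not the hysteron systems of the thesis, where oblique class boundaries motivate the SVC-inspired splits.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(6)

# Illustrative design space: each sample holds two switching thresholds.
X = rng.uniform(0.0, 1.0, size=(500, 2))

# Toy coarse-grained property: the "loop opens" when the up-switch
# exceeds 0.6 and the down-switch stays below 0.4 (made-up rule).
y = ((X[:, 0] > 0.6) & (X[:, 1] < 0.4)).astype(int)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["h_up", "h_down"]))
print(tree.score(X, y))
```

Because this toy rule is axis-aligned, a shallow tree isolates it exactly; when the true boundary is oblique in the design parameters, axis-aligned splits multiply, which is where replacing them with linear-SVC splits preserves both accuracy and readability.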