Quantum mechanics violates either realism (non-determinism) or locality. This follows from Bell's theorem, which shows that the measurement statistics of the EPR experiment can never be fully reproduced by any local hidden-variable model (LHV model). Over the last few decades, research has investigated the degree to which LHV models can reproduce measurement statistics approximating those of the EPR experiment. Of particular interest is which amplitudes can be reproduced by LHV models with exact correlation. Our analysis and the literature take into consideration: a density-matrix separability criterion on the two-electron singlet state, constructions by Kaszlikowski and Zukowski, and the Grothendieck constant. Further open questions arise, and new approaches to the locality of quantum states are given.
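The impossibility claim behind Bell's theorem is usually made quantitative via the CHSH inequality: deterministic local strategies reach a correlation value of at most 2, while quantum mechanics attains 2√2. The snippet below (an illustration, not part of the thesis) verifies the local bound by brute force over all deterministic strategies; mixtures of deterministic strategies cannot exceed their maximum.

```python
from itertools import product
from math import sqrt

# CHSH value S = E(0,0) + E(0,1) + E(1,0) - E(1,1) for a deterministic
# local strategy: Alice outputs a0 or a1, Bob outputs b0 or b1 (all in ±1).
lhv_max = max(
    a0 * b0 + a0 * b1 + a1 * b0 - a1 * b1
    for a0, a1, b0, b1 in product((-1, 1), repeat=4)
)

tsirelson = 2 * sqrt(2)  # the quantum-mechanical maximum (Tsirelson's bound)
print(lhv_max, round(tsirelson, 3))  # → 2 2.828
```

The gap between 2 and 2√2 is exactly what rules out any LHV reproduction of the singlet-state statistics.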
This bachelor thesis completes my mathematics bachelor programme at Leiden University. It is an original, unpublished dissertation, written by me. I really appreciate the people who have helped me during this project. Not only have I learned a lot about mathematics, but I have also learned about social topics such as the current Electoral Law. I would first like to thank my supervisor Richard Gill, professor at the Mathematical Institute of Leiden University, specialized in mathematical statistics. He gave me the opportunity to work on the project in a flexible way and showed me a great deal of appreciation and trust. Without him I could never have seen the beauty of statistics, could not have laughed and learned so much at the same time, and could not have completed this interesting project on time. I thank Richard very much for the useful feedback and critical notes, and for proposing this project. I hope that we will work together again in the future. I also want to thank my family and friends for supporting and motivating me. Being able to talk about my research during the past few months has helped me a lot. Not only did they listen, but they also helped me cope with the pregnancy, which eased the tension. I also greatly appreciate the assistance of Onno van Gaans and Frank den Hollander, professors at Leiden University, and dr. C.M. van Driel, Legal Assistant at the Election Information Point; in addition, I thank H. J. J. te Riele for sharing his article about the proportional-representation problem in the Second Chamber. I hope you will enjoy reading this thesis.
For the applied statistician, data augmentation is a powerful tool for solving optimization problems. In this thesis, I address a problem in some data-augmented Gibbs samplers. I show that although introducing latent variables renders a sampling problem tractable, this comes at the price of raising the autocorrelation of the Markov chain as the number of parameters increases, in this case the number of items in a test. By means of an example, I show that data augmentation is a powerful yet inefficient tool when the number of items grows, since the autocorrelation (and hence the rate of convergence) of the augmented Gibbs sampler in question is proved to depend on the number of item parameters. We wish to show that although most data-augmented samplers are well behaved, in this example the algorithm becomes very slow and risks grinding to a halt.
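The mechanism at work, a latent block tightly coupled to the parameters making the chain sticky, can be seen in miniature in a two-block Gibbs sampler for a bivariate normal with correlation ρ: each coordinate chain is AR(1) with lag-1 autocorrelation ρ². This is a toy sketch, not the thesis's item-response model; the value ρ = 0.95 is chosen only to make the slow mixing visible.

```python
import numpy as np

rng = np.random.default_rng(0)
rho, n_iter = 0.95, 100_000

# Gibbs sampler for (X, Y) bivariate standard normal with correlation rho:
# X | Y=y ~ N(rho*y, 1 - rho^2), and symmetrically for Y | X.
sd = np.sqrt(1 - rho**2)
x, y = 0.0, 0.0
xs = np.empty(n_iter)
for t in range(n_iter):
    x = rho * y + sd * rng.standard_normal()
    y = rho * x + sd * rng.standard_normal()
    xs[t] = x

# Empirical lag-1 autocorrelation of the X-chain; theory gives rho**2 = 0.9025.
acf1 = np.corrcoef(xs[:-1], xs[1:])[0, 1]
```

As ρ → 1 the autocorrelation approaches 1 and the effective sample size per iteration collapses, which is the qualitative behaviour the thesis establishes as the number of item parameters grows.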
The introduction of the Rényi entropy allowed a generalization of the Shannon entropy and unified its notion with that of other entropies. However, so far there is no generally accepted conditional version of the Rényi entropy corresponding to that of the Shannon entropy. The various definitions proposed in the literature each lack central and natural properties in one way or another. In this thesis we propose a new definition for the conditional Rényi entropy. Our new definition satisfies all of the properties we deem natural. First and foremost, it is consistent with the existing, commonly accepted definition of the conditional Shannon entropy, as well as with the right notion of the conditional min-entropy. Furthermore, and in contrast to previously suggested definitions, it satisfies the two natural properties of monotonicity and the (weak) chain rule, which we feel must be satisfied by any 'good' entropy notion. Another characteristic of our new definition is that it can be formulated in terms of the Rényi divergence. Additionally, it enables the use of (entropy) splitting. We conclude with an application where we use our new entropy notion as a tool to analyze a particular quantum cryptographic identification scheme.
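For reference, the standard unconditional definitions the abstract builds on; these are textbook facts, not taken from the thesis:

```latex
H_\alpha(X) = \frac{1}{1-\alpha}\,\log \sum_{x} p(x)^{\alpha},
\qquad \alpha \in (0,1)\cup(1,\infty),
```

with the Shannon entropy and the min-entropy recovered as limits:

```latex
\lim_{\alpha \to 1} H_\alpha(X) = -\sum_x p(x)\log p(x) = H(X),
\qquad
\lim_{\alpha \to \infty} H_\alpha(X) = -\log \max_x p(x) = H_{\min}(X).
```

A satisfactory conditional version must collapse to the conditional Shannon and min-entropies at these same two limits, which is the consistency requirement the abstract mentions.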
The topic of this Master Thesis is quantitative methodologies for optional features and bundles. The frame is that of Quantitative Marketing Research, a field whose goal is to provide market intelligence in the form of, among other things, market shares, population clustering and scenario simulations. The particular problem we have worked on is that of optional features and bundles, i.e. services that can be selected for an extra price when purchasing a product. The technique we have used in our analysis is a discrete choice model, Choice-Based Conjoint. The content of this thesis is based on an internship at the international market research company SKIM. The internship was jointly supervised by Senior Methodologist Kees van der Wagt (SKIM) and Prof. Dr. Richard Gill (Mathematisch Instituut Leiden). The two most important results of the thesis are new methodologies to study products with optional features and bundles. These methodologies produce utilities that match the respondents' observed choices: knowing only the estimated utilities, we are able to answer the questionnaire with answers similar to the observed ones. The methodologies enjoy all typical properties of conjoint methodologies and can be used to calculate market shares, simulate scenarios, etc. Their most interesting feature is that they make it possible to tell whether offering an option makes a product too complicated. They can also tell whether the mere presence of options makes the product more appealing (halo effect). As far as we know, this is the first study in this promising field. The methodologies we propose were developed with tests on simulated datasets and are validated on two different datasets arising from studies conducted by SKIM.
The software of choice for the estimation procedure was Sawtooth's implementation of CBC/HB. For reproducibility of the experiments we also wrote a package in the open-source language R that reproduces the same algorithm. This package and the MATLAB code used in the simulations can be found in the Appendix.
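Once utilities have been estimated, conjoint studies typically convert them into shares of preference with a multinomial-logit rule. The sketch below uses invented utilities, not SKIM's data or Sawtooth's code, and shows only the basic share computation underlying scenario simulation.

```python
import numpy as np

def logit_shares(utilities):
    """Multinomial-logit share of preference: a softmax over total utilities."""
    u = np.asarray(utilities, dtype=float)
    e = np.exp(u - u.max())  # subtract the max for numerical stability
    return e / e.sum()

# Hypothetical scenario: a base product, the same product with an optional
# bundle, and a competitor; the utilities are invented for illustration.
shares = logit_shares([1.0, 1.6, 0.8])
```

Raising the utility of one alternative, e.g. by adding a well-received option, pulls share away from the others, which is how "what if we offer this bundle" scenarios are simulated.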
Since the introduction of quantum mechanics, countless people have racked their brains over the interpretation of the theory. An answer to the measurement problem is part of any interpretation, and consequently the problem has frequently been discussed. Some, however, do not see a measurement problem at all, and hence the measurement problem problem was born: does the measurement problem exist at all? Belavkin acknowledges a problem and described a way to solve it in a difficult paper in 2007 (see [1]). His theory is not widely known, probably due to the complexity of his paper. It therefore seems important to explain Belavkin's framework in the simplest possible way, applied to the simplest examples, to help it become better known and understood. The goal of this thesis is to formulate Belavkin's framework in an accessible way. In addition, we try to understand several widely used models in quantum theory using his framework. This thesis is not an attempt to reconstruct or simplify the whole of Belavkin's theory: we treat only the discrete-time part, not the continuous-time part. Nor is it a literature review of other approaches to the measurement problem. The thesis has the following structure. The first section gives our view on the measurement problem, which directly gives our take on the measurement problem problem. Belavkin's framework as we use it in this thesis is constructed in the second section; it is then used in Section 3 to give a simple but important application. The main part of the thesis is Section 4, which shows that the master-equation approach, often used in quantum optics, can be embedded in Belavkin's theory.
The constructive way in which we do this gives us the machinery to give a Belavkin-type solution to the problem of modeling a Geiger counter.
In this thesis I discuss Conditional Independencies of Joint Probability Distributions (hereafter called CIs and JPDs respectively) over a finite set of discrete random variables. Recall that for any such JPD we can write down a list of all CIs between two subsets of variables given a third. Such a list is called a CI-trace. An arbitrary list of CIs is called a CI-pattern, without knowing a priori whether there exists a corresponding JPD with this CI-pattern. For simplicity, and without loss of generality, we take all JPDs over n + 1 variables and label the variables by the integers 0, 1, . . . , n. A CI-trace now becomes a set of triples of subsets of [n] (by [n] I denote the set {0, 1, . . . , n}). For example (A, B, C) with A, B, C ⊂ [n] is such a triple; it can also be written A⊥B|C, meaning that the random variables of A are independent of the random variables of B given any outcome on the random variables of C. It was long believed that CI-traces could be characterised by some finite set of rules, called Conditional Independence rules (CI-rules). Such a CI-rule would state that if a CI-trace contains a certain pattern of triples, it must also contain a certain further triple. Moreover, the pattern of a CI-rule should itself be finite: it should consist of k CIs, called the antecedents, that validate a (k+1)-th CI, called the consequent. The order of a CI-rule is the number k of its antecedents. This idea would imply that the set of all CI-traces equals the set of all CI-patterns closed under the CI-rules. In 1992 Milan Studený wrote an article on this subject, Conditional Independence Relations have no finite complete characterisation, in which he proved that such a characterisation is not possible.
The main goal of my thesis was to understand this article and to work out a readable version of the theorem and its proof. The proof rests on two major parts: first, the existence of a particular JPD and its CI-pattern on n + 1 variables; second, a proposition about CI-patterns based on entropies. The remainder of my thesis contains sections on these two parts, on Studený's theorem, and a small summary of the changes I made.
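The triples of a CI-trace can be checked mechanically on a given JPD via the identity p(abc)·p(c) = p(ac)·p(bc). The toy check below (my illustration, not Studený's construction) uses the classic XOR distribution, for which 0⊥1 holds marginally but 0⊥1|2 fails — a reminder of how unintuitive CI-traces can be.

```python
from itertools import product

# Toy JPD over binary variables 0, 1, 2: X0, X1 uniform and independent,
# X2 = X0 XOR X1.
jpd = {(x0, x1, x0 ^ x1): 0.25 for x0, x1 in product((0, 1), repeat=2)}

def marginal(p, vars_):
    """Marginal distribution of the variables in vars_ (a tuple of indices)."""
    out = {}
    for outcome, prob in p.items():
        key = tuple(outcome[v] for v in vars_)
        out[key] = out.get(key, 0.0) + prob
    return out

def ci_holds(p, A, B, C, tol=1e-12):
    """Check the triple (A, B, C), i.e. A ⊥ B | C, on the binary JPD p,
    using p(abc) * p(c) == p(ac) * p(bc) for every assignment."""
    pABC, pAC = marginal(p, A + B + C), marginal(p, A + C)
    pBC, pC = marginal(p, B + C), marginal(p, C)
    for o in product((0, 1), repeat=len(A) + len(B) + len(C)):
        a, b, c = o[:len(A)], o[len(A):len(A) + len(B)], o[len(A) + len(B):]
        lhs = pABC.get(o, 0.0) * pC.get(c, 0.0)
        rhs = pAC.get(a + c, 0.0) * pBC.get(b + c, 0.0)
        if abs(lhs - rhs) > tol:
            return False
    return True
```

Here `ci_holds(jpd, (0,), (1,), ())` is true while `ci_holds(jpd, (0,), (1,), (2,))` is false: conditioning on the XOR makes the first two variables perfectly dependent.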
This thesis is about the statistical calculations used in the court case of Lucia de B. In June 2004 she was convicted by the court of appeal in The Hague of 7 murders and 3 attempted murders, for which she received a life sentence with TBS (detention under a hospital order). She had worked in several hospitals, including the Juliana Children's Hospital (JKZ) and the Red Cross Hospital (RKZ). The statistician dr. Elffers was asked by the court to write a statistical report on the case. Initially he carried out calculations only for the JKZ, because only that hospital's data had been released. It was in this hospital that she first came under suspicion, because a great many incidents occurred during her shifts (incidents here meaning deaths and resuscitations). At the court's request, calculations were later also done for two wards of the RKZ where she had worked during the same period. The question was whether it could be coincidence that Lucia was involved in all those incidents while being innocent. During the trial the court asked dr. Elffers what the probability is that it could be coincidence that Lucia was present at so many incidents. Dr. Elffers calculated the probability that an arbitrary person would be involved in so many incidents if the incidents occur at random. Why are these calculations so important, and what influence did they have during the trial? The statisticians believed that there was medical evidence for the murders, and the physicians believed that there was statistical evidence for them. This led the court to find Lucia guilty. In this thesis we examine the calculations dr. Elffers made, the criticisms of them, and possible improvements. To this end, alternative statistical test statistics are discussed and compared.
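In its simplest form, the "could it be coincidence?" question is a hypergeometric tail probability: if the incident shifts were a uniform random subset of all shifts, how likely is it that an innocent nurse's shifts contain at least the observed number of them? The numbers below are invented for illustration and are not the case data or Elffers's figures.

```python
from math import comb

def coincidence_prob(total_shifts, nurse_shifts, incidents, observed):
    """P(X >= observed) under the hypergeometric null hypothesis: the
    `incidents` incident shifts are drawn uniformly at random from all
    `total_shifts` shifts, of which the nurse worked `nurse_shifts`."""
    return sum(
        comb(nurse_shifts, k) * comb(total_shifts - nurse_shifts, incidents - k)
        for k in range(observed, incidents + 1)
    ) / comb(total_shifts, incidents)

# Hypothetical ward: 1000 shifts, 140 of them worked by the nurse,
# 8 incidents, all 8 occurring during her shifts.
p = coincidence_prob(1000, 140, 8, 8)
```

A tiny p-value under this null says only that the pattern is unlikely *if* incidents strike shifts at random; much of the criticism discussed in the thesis concerns whether that null model, and the way data from several wards were combined, is appropriate at all.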