We consider two-player stochastic games played on finite graphs with reachability (and Büchi) objectives, where the first player tries to ensure that a target state is visited (or visited infinitely often) almost-surely, i.e., with probability 1, or positively, i.e., with positive probability, no matter the strategy of the second player.
We classify such games according to the information and to the power of randomization available to the players. On the basis of information, the game can be one-sided with either (a) player 1, or (b) player 2 having partial observation, or two-sided with (c) both players having partial observation.
On the basis of randomization, (a) the players may not be allowed to use randomization (pure strategies), or (b) they may choose a probability distribution over actions but the actual random choice is external and not visible to the player (actions invisible), or (c) they may use full randomization.
Our main results for pure strategies are as follows:
(1) For one-sided games with player 2 perfect observation we show that (in contrast to fully randomized strategies) belief-based strategies are not sufficient, and we present an exponential upper bound on memory both for almost-sure and positive winning strategies; we show that the problem of deciding the existence of almost-sure and positive winning strategies for player 1 is EXPTIME-complete.
(2) For one-sided games with player 1 perfect observation we show that non-elementary memory is both necessary and sufficient for both almost-sure and positive winning strategies.
(3) We show that for the general (two-sided) case finite-memory strategies are sufficient for both positive and almost-sure winning, and that at least non-elementary memory is required.
We establish the equivalence of the almost-sure winning problems for pure strategies and for randomized strategies with actions invisible. Our equivalence result exposes serious flaws in previous results in the literature: we show a non-elementary memory lower bound for almost-sure winning, whereas an exponential upper bound was previously claimed.
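The perfect-information, non-stochastic special case of these games admits a simple fixpoint computation of the states from which player 1 can force a visit to the target (the classical attractor construction). The sketch below is only illustrative and does not cover the partial-observation or probabilistic settings discussed above; all names are hypothetical:

```python
def attractor(states, edges, controlled, target):
    """Player-1 attractor of `target` in a two-player reachability game.

    states: set of states; edges: dict mapping a state to its successors;
    controlled: set of player-1 states (player 2 owns the rest);
    target: set of states player 1 tries to reach.
    Returns the states from which player 1 can force a visit to `target`
    (perfect-information, non-stochastic special case only).
    """
    win = set(target)
    changed = True
    while changed:
        changed = False
        for s in states - win:  # snapshot of not-yet-winning states
            succs = edges.get(s, [])
            if not succs:
                continue
            if s in controlled:
                # Player 1 chooses: one good successor suffices.
                ok = any(t in win for t in succs)
            else:
                # Player 2 chooses: every successor must already be winning.
                ok = all(t in win for t in succs)
            if ok and s not in win:
                win.add(s)
                changed = True
    return win
```

For example, if `a` belongs to player 1 with moves to `b` and `c`, `b` (player 2) moves only to the target `d`, and `c` loops forever, then player 1 wins exactly from `a`, `b`, and `d`.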
Systems biology is a new, emerging and rapidly developing, multidisciplinary research field that aims to study biochemical and biological systems from a holistic perspective, with the goal of providing a comprehensive, system-level understanding of cellular behaviour. In this way, it addresses one of the greatest challenges faced by contemporary biology, which is to comprehend the function of complex biological systems. Systems biology, unlike “traditional” biology, focuses on high-level concepts such as: network, component, robustness, efficiency, control, regulation, hierarchical design, synchronization, concurrency, and many others. The very terminology of systems biology is “foreign” to “traditional” biology; it marks a drastic shift in the research paradigm and indicates the close linkage of systems biology to computer science.
One of the basic tools utilized in systems biology is the mathematical modelling of life processes tightly linked to experimental practice. In my talk, I will raise a number of challenges commonly encountered in the computational modelling in systems biology. In particular, I am going to concentrate on issues related to model construction methodologies (e.g., parameter estimation, model validation, model identifiability) and I will present some new techniques for the problem of quantitative model comparison. Moreover, I am going to discuss various techniques useful for applying simplifications or extensions to an already fitted and validated mathematical model. As an example of such techniques, I will present the recently introduced concept of model refinement in the context of ODE-based self-assembly models.
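As a toy illustration of the parameter-estimation step mentioned above, a single rate constant of a one-variable ODE model can be fitted by grid search against measured data; the decay model, `simulate`, and `fit_k` are hypothetical simplifications, not the techniques of the talk:

```python
def simulate(k, x0, ts, dt=0.01):
    """Euler-integrate the toy model dx/dt = -k*x and sample it
    at the (ascending) time points ts."""
    xs, x, t = [], x0, 0.0
    for target_t in ts:
        while t < target_t:
            x += dt * (-k * x)
            t += dt
        xs.append(x)
    return xs

def fit_k(ts, data, x0, candidates):
    """Grid-search parameter estimation: return the candidate rate
    constant minimizing the sum of squared errors against data."""
    def sse(k):
        return sum((m - d) ** 2 for m, d in zip(simulate(k, x0, ts), data))
    return min(candidates, key=sse)
```

Real modelling pipelines replace the Euler scheme with an adaptive ODE solver and the grid search with a proper optimiser, and then face exactly the identifiability and validation issues the talk addresses.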
Formal, symbolic techniques are extremely useful for modelling and analyzing security protocols. They have improved our understanding of security protocols, made it possible to discover flaws, and also provide support for protocol design. However, such analyses usually consider that the protocol is executed in isolation or assume a bounded number of protocol sessions. Hence, no security guarantee is provided when the protocol is executed in a more complex environment.
In this talk, we study whether password protocols can be safely composed, even when the same password is reused. We first present how resistance against offline guessing attacks can be modelled in cryptographic process calculi using the notion of static equivalence. We show that resistance against guessing attacks composes in the presence of an eavesdropper (i.e. a passive attacker). However, composition does not preserve resistance against guessing attacks for an active attacker in general. Therefore, we present security enhancing transformations which allow inter-protocol and inter-session composition for protocols that may reuse the same password. Our result provides an effective strategy to design secure password protocols: (i) design a protocol intended to be secure for one protocol session; (ii) apply our transformation and obtain a protocol which is secure for an unbounded number of sessions and in the presence of other password protocols (if these were designed using this transformation).
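A minimal sketch of why offline guessing works for a passive attacker: suppose the eavesdropper observed a MAC keyed with a password-derived key; then every dictionary password can be tested locally, without further interaction. The toy KDF and the names below are illustrative, not those of any concrete protocol:

```python
import hashlib
import hmac

def derive_key(password: str, salt: bytes) -> bytes:
    # Toy KDF for illustration only; real protocols use e.g. PBKDF2/scrypt.
    return hashlib.sha256(salt + password.encode()).digest()

def offline_guess(observed_tag: bytes, salt: bytes, nonce: bytes, dictionary):
    """Offline guessing attack: a passive attacker who observed
    MAC(derive_key(pw), nonce) on the wire tries every dictionary
    password locally and checks which one reproduces the tag."""
    for guess in dictionary:
        k = derive_key(guess, salt)
        tag = hmac.new(k, nonce, hashlib.sha256).digest()
        if hmac.compare_digest(tag, observed_tag):
            return guess  # password confirmed without interacting
    return None
```

Static equivalence captures exactly this: the protocol resists guessing attacks when the attacker's frame with the real password is indistinguishable from one with any wrong guess, so no such local test exists.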
In the presentation, I will show the formalism that provides the estimate, and argue why it is better than the alternative methods. There are straightforward extensions, such as assuming (finite) state-dependent agents, assuming evolving agents, using recommendations, etc. For each of those extensions, the formalisation is easy, but not (immediately) usable. We will see the problems that arise in these extensions.
I will present new metrics we proposed to measure users' query privacy taking into account user profiles. Furthermore, I will also present the spatial generalisation algorithms to compute regions satisfying users' privacy requirements expressed with these metrics. Experimental results show that our metrics and algorithms are effective and efficient for practical use.
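As a rough illustration of spatial generalisation (not the algorithms of the talk), a region can be grown around the user's position until it contains at least k users, in the style of k-anonymity metrics; `cloak` and its parameters are hypothetical:

```python
def cloak(user_pos, all_pos, k, step=1.0):
    """Spatial generalisation sketch: grow a square region around the
    user's position until it contains at least k users, then report
    the region instead of the exact location.

    user_pos: (x, y) of the querying user; all_pos: positions of all
    users (including the querying one); k: anonymity requirement.
    Returns (region, count) with region = (xmin, ymin, xmax, ymax).
    """
    half = step
    while True:
        x, y = user_pos
        region = (x - half, y - half, x + half, y + half)
        inside = [p for p in all_pos
                  if region[0] <= p[0] <= region[2]
                  and region[1] <= p[1] <= region[3]]
        if len(inside) >= k:
            return region, len(inside)
        half += step  # not enough users yet: enlarge the region
```

Profile-aware metrics refine the count-based requirement used here: instead of "at least k users", the region must contain enough users whose profiles make them plausible issuers of the query.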
The presentation will expose the theoretical bases and mechanisms of BDMP. Different simple use-cases and quantification examples will then illustrate the relevance of the overall approach. Finally, limits and improvement perspectives will be discussed.
In this talk we will discuss a variation based on correlated data. For instance, two smart phones equipped with accelerometers could be held together in one hand and shaken for some 5-10 seconds. The correlation between the readings of the two sensors can be used to confirm the DH key established by a cryptographic protocol, making it considerably harder for an adversary to mount a man-in-the-middle attack. We will also present some ideas on how to improve the protocol and widen the scope of possible applications.
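A minimal sketch of the confirmation step, assuming the devices compare their accelerometer traces after the DH exchange; `pearson`, `same_shake`, and the 0.9 threshold are illustrative choices, not the protocol's actual parameters:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length traces."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sx = sum((a - mx) ** 2 for a in xs) ** 0.5
    sy = sum((b - my) ** 2 for b in ys) ** 0.5
    return cov / (sx * sy)

def same_shake(trace_a, trace_b, threshold=0.9):
    """Accept the established DH key only if the two accelerometer
    traces are strongly correlated, i.e. the devices were plausibly
    shaken together in the same hand."""
    return pearson(trace_a, trace_b) >= threshold
```

A man-in-the-middle who was not present during the shake cannot produce a trace correlated with the victims' motion, so his key would fail this check.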
A few privacy preserving mechanisms have been proposed in the last few years. We found that the attacker can still learn more about users' locations with the data available in the proposed systems, for example, through statistical analysis.
In this project, we want to analyze the effects of statistical analysis from the attacker's point of view, and to determine whether public information, e.g., home addresses or driving habits, can help break users' location privacy.
First, we introduce our formalism named the Process Hitting, and its application to the modelling of BRNs with discrete values. With the aim of simulating and analysing quantitative properties, and as a means to provide a trade-off between stochastic and timed specifications, we show a generic method to tune temporal properties within the stochastic semantics of Process Hitting. Besides its clear biological modelling orientation, we claim that the Process Hitting can be applied to less specific dynamical complex systems.
Finally, we address the large-scale analysis of Process Hitting by providing a static analysis by abstract interpretation. Our static analysis brings very efficient over- and under-approximations of reachability properties. The scalability of our approach is illustrated by its application to the decision of reachability of gene expression levels within BRNs of 94 and 104 components. Our implementation answers very quickly, while well-established symbolic model checking techniques regularly fail because of the state-space explosion.
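As a loose illustration of what an over-approximation of reachability can look like (not the actual Process Hitting abstract interpretation), one can drop the constraint that a process occupies a single level at a time and simply accumulate every level that an action could ever produce; the encoding below is a hypothetical simplification:

```python
def over_approx_reachable(initial, actions):
    """Over-approximation of reachability in a Process-Hitting-like model.

    initial: set of (process, level) pairs holding initially.
    actions: list of (hitter, before, after) triples, each a
    (process, level) pair: when `hitter` and `before` are present,
    the action can replace `before` by `after`.
    The abstraction never removes a level once reached (no mutual
    exclusion between levels of one process), so the result can only
    over-approximate the concretely reachable levels: if a level is
    absent here, it is certainly unreachable.
    """
    reach = set(initial)
    changed = True
    while changed:
        changed = False
        for hitter, before, after in actions:
            if hitter in reach and before in reach and after not in reach:
                reach.add(after)
                changed = True
    return reach
```

Such a fixpoint runs in time linear in the number of actions per iteration, which is the kind of cost that stays tractable for networks with around a hundred components, in contrast to enumerating the exponential concrete state space.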
I will also present a formalization of automated analysis techniques for the validation of web services specified in BPEL and a RBAC variant tailored to BPEL. The idea is to use decidable fragments of first-order logic to describe the state space of a certain class of web services and then use state-of-the-art SMT solvers to handle their reachability problems. To assess the practical viability of our approach, we have developed a prototype tool implementing our techniques and applied it to a digital contract signing service inspired by an industrial case study.
Before system procurement, a security goal and a threat model were chosen, and we discuss the motivation and thinking behind this choice. An open procurement process was used to decide on one vendor. Implicit in the choice of vendor was also a choice of cryptographic voting scheme. We argue briefly for why the scheme achieves its security goal under the relevant threat model, discuss some of the cryptographic assumptions used in this argument, and discuss some of the non-cryptographic measures taken to secure the system.
(joint work with Nils Bulling)