PRELIMINARY PROGRAM (FOR ABSTRACTS SEE BELOW):
Tuesday, 26 April 2016
Wednesday, 27 April 2016
9:00 – 10:30 
Session V: Cryptography, Trust, Verification. Bogdan Aman (INFOIaşi): Timed migration and interaction in distributed systems. Stefan Ciobâcă (UAIC Iaşi): Trusted algorithms and implementations. Bogdan Warinschi (Bristol): Cryptographic techniques for electronic voting.
10:30 – 11:00
Coffee break
11:00 – 12:30
Session VI: Logic and Symbolic Computation. Ioana Leuştean (UNIBUC): Nuances of truth in many-valued logic. Mădălina Eraşcu (UVT): Using Real Quantifier Elimination for synthesizing optimal numerical algorithms. Tudor Jebelean (Johannes Kepler U, Linz): Proof-based Synthesis of Sorting Algorithms for Trees
12:30 – 14:00
Lunch break
14:00 – 15:30
Session VII: Game-Theoretic Models. Matei Popovici (UPB): Using game theory to understand client-ISP behavior in the presence of Multipath TCP. Marcel Cremene (UT Cluj): The Social Honesty game – a model of honest behavior dynamics. Cosmin Bonchiş (UVT): Worst-case fairness in TU-cooperative games.
15:30 – 16:00
Coffee break
16:00 – 17:30 
Round table: Romanian computer science research: present state, opportunities, problems. – Gabriel Istrate (moderator), – Prof. Dana Petcu (vice-rector of UVT, director of the e-Austria Institute), – Prof. Viorel Negru (president of the UVT senate, member of the CNATDCU computer science committee), – Prof. Radu Marinescu (former vice-president of CNCS), – Răzvan Florian (former advisor to the Minister of Education)

Networking event (organized at the conference level)
ABSTRACTS:
Bogdan Alexe (Univ. Bucuresti): Measuring objectness and difficulty for visual learning
Abstract: The long-term goal of computer vision is to develop computer-based systems with vision capabilities similar to those of humans, achieving the dream of automatic visual understanding of the world. A prerequisite for building seeing machines that analyze visual information and reason about the environment as humans effortlessly do is visual object detection. In this talk I will present a class-generic object detector and an image difficulty predictor, and their use to reduce the degree of supervision for learning object classes. In the first part of my talk I will present an objectness measure that quantifies how likely it is for an image window to contain an object of any class. In the second part of the talk I will present two applications that use objectness to facilitate learning object classes in weakly supervised scenarios. Given a set of training images known to contain objects of a certain class, we learn models capable of localizing objects in new test images. Lastly, I will present a measure for estimating image difficulty, defined as the human response time for solving a visual search task, and show how it improves the weakly supervised object localization task.
Dan Alistarh (Microsoft Research): Data Structures of the Future: Concurrency, Relaxation, and Optimism
A clear trend in computer science in recent years is the need to process growing amounts of data efficiently. This substantial growth in data volume is radically changing the way software and hardware are designed, implemented, and used by programmers. In this talk, I will give examples of new data structures inspired by this trend. More precisely, I will discuss the inherent limits of classical data structures in a parallel setting, and ways to avoid these limitations.
The first method is to relax the guarantees offered by the data structures, through approximation or randomization. The second method consists of changes to the hardware architecture that allow more parallelism. Time permitting, I will also discuss the main research directions for the future.
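A minimal sketch of the first method (relaxing guarantees): a sharded counter whose reads return an approximate snapshot rather than a linearizable value. This is an illustrative toy on the talk's theme, not code from the talk; the class and method names are invented.

```python
class RelaxedCounter:
    """Approximate counter: each thread increments its own shard, and a
    read sums the shards. Under concurrency the sum may lag behind
    in-flight increments -- exactness is traded for low contention."""

    def __init__(self, num_shards=8):
        self.shards = [0] * num_shards

    def increment(self, thread_id):
        # Each thread touches only its own shard, avoiding contention.
        self.shards[thread_id % len(self.shards)] += 1

    def read(self):
        # A relaxed snapshot: accurate once all increments have landed.
        return sum(self.shards)

c = RelaxedCounter()
for t in range(100):
    c.increment(t)
print(c.read())  # 100 (exact here, since no increments are in flight)
```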
Biography: Dan Alistarh is currently a researcher at Microsoft Research Cambridge. Before MSR, Dan was a postdoc at MIT, where he worked with Nir Shavit and Nancy Lynch. Dan obtained his PhD from EPFL, and is mainly interested in distributed algorithms and parallel data structures.
Bogdan Aman (Iaşi): Timed Migration and Interactions in Distributed Systems
Abstract: We use process calculi with explicit locations, local clocks, timed migration and interactions to model distributed systems. What is now called the 'TiMo family' contains several calculi developed in recent years: a basic framework TiMo, a probabilistic extension pTiMo, a real-time version rTiMo, and access permissions in PerTiMo. A formal relationship between rTiMo and timed automata allows the use of the model checking capabilities provided by UPPAAL. A probabilistic temporal logic called PLTM was introduced to verify properties of pTiMo processes, making reference to temporal constraints over local clocks and multisets of actions.
Florin Bălaşa (The American University in Cairo): From Behavioral Specifications to Low-Power Data Storage in Embedded Signal Processing Applications
In embedded telecommunication and multimedia processing applications, the manipulation of large datasets has a major effect on both the power consumption and the performance of the system. Due to the significant amount of data transfers between the processing units and the large, energy-consuming off-chip memories, these applications are often called data-dominated or data-intensive.
This work describes a computer-aided design methodology for hierarchical memory architectures in embedded data-intensive applications from the area of multidimensional signal processing. The main input of this memory management framework is the behavioral specification of a given application. Different from previous works, three memory management tasks – the assignment of data to the memory layers, the mapping of signals to the physical memories, and the banking of the on-chip memory – are addressed in a consistent way, based on the same formal model. The main design target is the reduction of the static and dynamic energy consumption in the memory subsystem, but the same formal model and algorithmic principles can be applied to reduce the overall memory access time, or combinations of these design goals.
Dietmar Berwanger (CNRS, Université Paris-Saclay): Coordination under Uncertainty.
Much of what computers do depends on what other computers do. The task of coordinating processes so that each one does its part towards achieving a global objective raises intriguing algorithmic problems. In this talk, we focus on the question of how to overcome uncertainty about the global conditions of action on the basis of local, imperfect information.
First, we illustrate the complexity of designing coordination strategies in distributed systems by considering the simple but quintessential problem of reaching consensus without communicating. Then we present automated synthesis procedures for systems where the information flow follows different hierarchical patterns.
Adriana Birluţiu (Alba Iulia): Transfer Learning from a Machine Learning Perspective
Transfer learning has emerged as a new machine learning framework that investigates how to transfer knowledge from auxiliary source domains to facilitate a new learning task in the domains of interest. The idea behind transfer learning is that the involved domains share some common latent information (transfer factors or transfer components), which can be uncovered and exploited using different techniques as the bridge for knowledge transfer. Traditional machine learning techniques try to learn each task from scratch, while transfer learning techniques try to transfer knowledge from previous tasks to a target task when the latter has less high-quality training data. Given a target task, the questions are how to identify the commonality between the target task and previous (source) tasks, and how to transfer knowledge from the previous tasks to the target one. I will present several approaches to transfer learning and discuss some applications.
Cosmin Bonchiş (UVT): Worst-case fairness in TU-cooperative games.
Stefan Ciobâcă (UAIC Iaşi): Trusted Algorithms and Implementations.
Abstract: As our society becomes increasingly dependent on computing systems, the failure of such systems due to bugs can lead to serious consequences ranging from minor economic losses to loss of life. In order to prevent the failure of computing systems, software companies employ several quality assurance techniques to catch bugs before they reach production. Depending on how mission-critical the software is, quality assurance techniques range from simple testing to certified formal proofs of correctness. In this talk, we argue for the need for formal methods to verify algorithms and implementations, and we survey the existing formal verification methods.
Adrian Crăciun (UVT): Proof Complexity of Topological Coloring Principles.
Marcel Cremene (UT Cluj): The Social Honesty game – a model of honest behavior dynamics
Social (honest/dishonest) behavior is analyzed using a spatial game called 'Social Honesty' (SH) within the strategic interaction framework of cellular automata and computational evolutionary game theory. Probabilistic punishment is used as a dishonesty deterrence mechanism. In order to capture the intrinsic uncertainty of social environments, payoffs are described as random variables. The dynamics reveal a relation between punishment probability and punishment severity: punishment probability proves to be more significant than punishment severity. Critical values and transition intervals for punishment probability and severity are identified. Clusters of honest/dishonest players that emerge spontaneously from the very first rounds of interaction are decisive for the future dynamics. Small-world neighborhood topology, asynchronous updating, player identity, neighborhood diversity, and noise effects are analyzed. The existence of a narrow transition interval from dishonest to honest behavior is an invariant aspect in all experiments. Cluster size is influenced by neighborhood regularity, player identity and the presence of random noise. Conformism and identity slow down the system's evolution but make it more stable and help cluster formation and growth. The impact of conformism is nonlinear: a low conformism level does not promote honesty, a high conformism level favors the spread of honesty/cooperation, but as conformism approaches 100% conformism the ability of the system to evolve is negatively affected.
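A highly simplified sketch of such a spatial game with probabilistic punishment can be simulated in a few lines. The payoff values, neighborhood rule, and update rule below are illustrative assumptions, not the exact SH model from the talk.

```python
import random

def social_honesty_round(grid, p_punish, severity, temptation=1.5, rng=None):
    """One synchronous round of a toy spatial honesty game on a torus.

    grid[i][j] is 1 (honest) or 0 (dishonest). Honest players earn 1;
    dishonest players earn `temptation` but lose `severity` with
    probability `p_punish` (so payoffs are random variables). Each player
    then imitates its best-scoring Moore neighbour (itself included)."""
    rng = rng or random.Random(0)
    n = len(grid)

    def payoff(i, j):
        if grid[i][j] == 1:
            return 1.0
        return temptation - (severity if rng.random() < p_punish else 0.0)

    scores = [[payoff(i, j) for j in range(n)] for i in range(n)]
    new = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            nbrs = [((i + di) % n, (j + dj) % n)
                    for di in (-1, 0, 1) for dj in (-1, 0, 1)]
            bi, bj = max(nbrs, key=lambda ij: scores[ij[0]][ij[1]])
            new[i][j] = grid[bi][bj]
    return new

# With certain, severe punishment, honesty spreads from a single cluster.
grid = [[0] * 6 for _ in range(6)]
grid[0][0] = 1
for _ in range(3):
    grid = social_honesty_round(grid, p_punish=1.0, severity=2.0)
print(sum(map(sum, grid)))  # 36: the whole 6x6 lattice turned honest
```

With `p_punish=1.0` and `severity=2.0`, dishonesty pays -0.5 versus 1 for honesty, so the honest cluster grows by one Moore step per round, mirroring the cluster-driven dynamics described above.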
Liviu P. Dinu (UNIBUC): The evolution of Romanian: between genetic and cultural influence.
Mădălina Eraşcu (UVT): Using Real Quantifier Elimination for Synthesizing Optimal Numerical Algorithms.
Various problems in mathematics, science and engineering can be reduced to real quantifier elimination (quantifier elimination over real closed fields). Around 1930, Alfred Tarski showed that real quantifier elimination can be carried out algorithmically. In 1975, George Collins provided a dramatically more efficient algorithm. Since then, there has been intensive research to further improve efficiency for certain important classes of formulas (inputs). In our research, we consider formulas arising in the synthesis of optimal numerical algorithms. In this talk, we present a case study on the square root problem: given the real number x and the error bound eps, find a real interval such that it contains sqrt(x) and its width is less than eps. As usual, we begin by fixing an algorithm schema, namely iterative refining: the algorithm starts with an initial interval and repeatedly updates it by applying a refinement function, say R, to it until it becomes narrow enough. The synthesis then amounts to finding a refinement function R that ensures that the algorithm is correct (loop invariant), terminating (contraction), and optimal. All of these can be formulated as quantifier elimination over the real numbers. Hence, in principle, they can all be carried out automatically. However, the computational requirements are so huge that automatic synthesis is practically impossible with current general quantifier elimination software. Hence, we did some hand derivations and were able to synthesize, semi-automatically, optimal algorithms under suitable assumptions. This is joint work with Hoon Hong.
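To make the algorithm schema concrete, here is the iterative-refining schema instantiated with the simplest possible refinement function R, plain bisection. This is only a sketch of the schema; the talk is about synthesizing an *optimal* R, which bisection is not.

```python
def sqrt_interval(x, eps):
    """Return (lo, hi) with lo <= sqrt(x) <= hi and hi - lo < eps."""
    assert x >= 0 and eps > 0
    lo, hi = 0.0, max(1.0, x)  # loop invariant: lo*lo <= x <= hi*hi
    while hi - lo >= eps:      # contraction: the width halves each step
        mid = (lo + hi) / 2.0
        if mid * mid <= x:
            lo = mid           # sqrt(x) lies in the upper half
        else:
            hi = mid           # sqrt(x) lies in the lower half
    return lo, hi

lo, hi = sqrt_interval(2.0, 1e-6)
assert lo <= 2.0 ** 0.5 <= hi and hi - lo < 1e-6
```

The three synthesis requirements are visible in the comments: the loop invariant gives correctness, contraction gives termination, and optimality is precisely what a better choice of R would buy.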
Marc Frincu (UVT): TBA
Gabriel Istrate (UVT): TBA
Tudor Jebelean (Johannes Kepler U, Linz): Proof-based Synthesis of Sorting Algorithms for Trees
joint work with Isabela Dramnesc (West University, Timişoara, Romania) and Sorin Stratulat (Université de Lorraine, Metz, France)
We develop various proof techniques for the synthesis of sorting algorithms on lists and on binary trees. Appropriate induction principles are designed and various specific proof methods are experimented with, mixing rewriting with assumption-based forward reasoning and goal-based backward reasoning à la Prolog. The proof techniques are implemented in the Theorema system and are used for the automatic synthesis of several algorithms for sorting and for the necessary auxiliary functions. Moreover, we formalize and check some of the algorithms and some of their properties in the Coq system.
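To illustrate the kind of target such a synthesis produces, here is a hand-written sketch of tree sorting (turning an arbitrary binary tree into a search tree over the same elements) with its auxiliary functions insert and merge. This is an illustrative Python rendering, not the synthesized Theorema or Coq code.

```python
# Tuple-encoded trees: None is the empty tree, (left, value, right) a node.

def insert(x, t):
    """Insert x into a binary search tree, preserving the ordering."""
    if t is None:
        return (None, x, None)
    left, v, right = t
    if x <= v:
        return (insert(x, left), v, right)
    return (left, v, insert(x, right))

def merge(a, b):
    """Auxiliary: insert every element of tree a into search tree b."""
    if a is None:
        return b
    left, v, right = a
    return merge(left, merge(right, insert(v, b)))

def sort_tree(t):
    """Sort an arbitrary binary tree into a search tree (same elements)."""
    if t is None:
        return None
    left, v, right = t
    return insert(v, merge(sort_tree(left), sort_tree(right)))

def to_list(t):
    """In-order traversal; sorted iff t is a search tree."""
    return [] if t is None else to_list(t[0]) + [t[1]] + to_list(t[2])

t = ((None, 3, None), 1, (None, 2, None))  # unsorted tree
print(to_list(sort_tree(t)))  # [1, 2, 3]
```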
Ioana Leuştean (UNIBUC): Nuances of truth in many-valued logic.
(Joint work with Denisa Diaconescu)
Nuances of truth represent a robust paradigm in the framework of many-valued logics. The idea of nuancing goes back to Gr. C. Moisil and is the core of Łukasiewicz-Moisil logic. Moisil's determination principle states that a many-valued element is uniquely determined by a sequence of Boolean elements, its nuances. Consequently, there is a categorical adjunction between Boolean algebras and Łukasiewicz-Moisil algebras.
In this talk we survey the main features of Łukasiewicz-Moisil n-valued logic and analyze it in the modern context of many-valued logics.
Matei Popovici (UPB): Using game theory to understand client-ISP behaviour in the presence of Multipath TCP.
Multipath TCP (MPTCP) is a communication protocol which allows applications to use several Internet links as a single connection, thus increasing perceived throughput and ensuring reliability.
In a world where MPTCP is likely to be the new layer-4 standard, Internet Service Providers (ISPs) may manipulate the amount of traffic they forward for clients.
We use a game-theoretic approach for modelling and understanding client-ISP and ISP-ISP interactions in an Internet market. Our model is a multi-leader multi-follower game (a multi-player Stackelberg game). We use the subgame perfect equilibrium in order to predict optimal client and ISP behaviour in such markets.
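In the simplest two-player case, the subgame perfect equilibrium of a Stackelberg game can be computed by backward induction over finite action sets. The ISP/client payoffs below are made-up toy numbers for illustration; the talk's model is multi-leader multi-follower, which this sketch does not capture.

```python
def stackelberg_equilibrium(leader_actions, follower_actions,
                            u_leader, u_follower):
    """Subgame perfect equilibrium of a two-player Stackelberg game by
    enumeration: the follower best-responds to each leader action, and
    the leader commits to the action whose best response maximizes the
    leader's own payoff (ties broken by list order)."""
    def best_response(a):
        return max(follower_actions, key=lambda b: u_follower(a, b))
    a_star = max(leader_actions, key=lambda a: u_leader(a, best_response(a)))
    return a_star, best_response(a_star)

# Toy instance: the ISP (leader) sets a per-subflow price p; the client
# (follower) picks how many MPTCP subflows k to use.
prices = [1, 2, 3]
subflows = [0, 1, 2]
u_isp = lambda p, k: p * k                     # revenue
u_client = lambda p, k: 4 * k - p * k - k * k  # value minus cost/congestion
print(stackelberg_equilibrium(prices, subflows, u_isp, u_client))  # (2, 1)
```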
Gheorghe-Cosmin Silaghi (UBB): Dependable resource management in heterogeneous distributed systems
Nowadays, building federated computing infrastructures from various resources gathered from heterogeneous stakeholders, and transparently delivering the aggregated computing power, represent major challenges in distributed systems. With such approaches, one can benefit from cheap commodity hardware donated voluntarily, or from the strong specialization of a given stakeholder in a specific technology. Smooth QoS delivery requires mechanisms for economics-based resource scheduling, trading off the conflicting interests of consumers and suppliers, as well as tools for dependability, in order to transparently deal with malicious providers. We will present several research results dealing with reputation management and sabotage tolerance on the dependability side, and with SLA negotiation and economically enhanced scheduling on the resource management side.
Rozica-Maria Tache (UNIBUC): Extremal graphs with respect to some topological indices
A topological index is a graph invariant that characterizes the branching of a graph; such indices play an important role in mathematical chemistry, being useful tools for generating structure-property correlations for chemical compounds. We present recent results on extremal graphs with respect to some degree-based topological indices, such as the general sum-connectivity index, the multiplicative Zagreb indices, and the Narumi-Katayama index.
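As a concrete example of a degree-based index, the general sum-connectivity index can be computed directly from its definition, chi_alpha(G) = sum over edges uv of (deg(u) + deg(v))^alpha. A minimal sketch (edge-list representation is an assumption of this example):

```python
def general_sum_connectivity(edges, alpha):
    """chi_alpha(G): sum over edges uv of (deg(u) + deg(v)) ** alpha.

    edges: list of vertex pairs. alpha = -1/2 gives the classic
    sum-connectivity index; other exponents give its generalizations."""
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    return sum((deg[u] + deg[v]) ** alpha for u, v in edges)

# Path on 4 vertices: degree sequence 1, 2, 2, 1.
path4 = [(0, 1), (1, 2), (2, 3)]
print(general_sum_connectivity(path4, 1))  # (1+2) + (2+2) + (2+1) = 10
```

Extremal-graph results of the kind presented in the talk ask which graphs in a given class minimize or maximize such a quantity.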
Ioan Todincă (Orléans, France): How to cut up a graph: tree decompositions and other applications
Abstract: Many interesting and important algorithmic problems in graph theory are NP-hard: coloring, maximum clique, travelling salesman, and so on. A good part of fundamental research in algorithmics is devoted to techniques for tackling (NP-)hard optimization problems: approximation algorithms, parameterized (FPT) algorithms, heuristics.
I will talk about a graph decomposition method that consists of "cutting" the graph into "pieces" that fit together in a tree-like structure. To each graph one can associate a parameter called treewidth. The smaller this parameter, the more the graph "resembles" a tree. Most NP-hard problems can be solved relatively efficiently on graphs of small treewidth: the algorithms are exponential in the treewidth parameter, but linear in the number of vertices.
I will show how such an optimal tree decomposition can be obtained using minimal separators.
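To see why small treewidth helps, consider the extreme case of treewidth 1, i.e. trees: maximum independent set, NP-hard in general, becomes a linear-time dynamic program over the tree. A minimal sketch (the adjacency-dict encoding is an assumption of this example):

```python
def max_independent_set(tree, root=0):
    """Size of a maximum independent set of a tree (treewidth 1).

    tree: adjacency dict {vertex: [neighbours]}. For each vertex we keep
    two values, the best set size with and without that vertex -- the
    same state a tree-decomposition DP would keep per bag."""
    def solve(v, parent):
        with_v, without_v = 1, 0
        for w in tree[v]:
            if w == parent:
                continue
            cw, cwo = solve(w, v)
            with_v += cwo             # taking v forces children out
            without_v += max(cw, cwo)  # otherwise children are free
        return with_v, without_v
    return max(solve(root, None))

# Star K_{1,3}: taking the three leaves beats taking the centre.
star = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0]}
print(max_independent_set(star))  # 3
```

On a general graph of treewidth k, the same idea keeps one table per bag of the tree decomposition, with roughly 2^k states, which is where the "exponential in treewidth, linear in vertices" bound comes from.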
Bogdan Warinschi (Bristol): Cryptographic techniques for electronic voting
TBA.